EU Research Winter 2015/16

EXPANDING THE HORIZON

The Philosophy of Self-Control

Conquering Complexity in Cardiovascular Disease

Suicide Gene Therapy proven to kill cancer cells

Disseminating the latest research under FP7 and Horizon 2020

Follow EU Research on www.twitter.com/EU_RESEARCH



Editor’s Note

Clinicians’ approach to healthcare solutions may soon achieve the equivalent of a quantum leap. For instance, we are on the cusp of creating new kinds of computer simulations for patient-specific treatments for a range of physiological conditions.

By using state-of-the-art computer imaging and simulations based on an individual’s physiology and health conditions, we can aspire to analysis that is accurate and predictive of the best care scenario, without multiple operations, extensive biopsies, drug testing and the like. Similarly, testing new drugs and the effects they have on organs has long involved the suffering of animals, or long clinical trials; computer simulation technology has the potential to change this by simply mapping and modelling the pathways and outcomes of potential new treatments.

This is a branch of research that could therefore dramatically change lives.

As a seasoned editor and journalist, Richard Forsyth has been reporting on numerous aspects of European scientific research for over 10 years. He has written for many titles including ERCIM’s publication, CSP Today, Sustainable Development magazine and eStrategies magazine, and remains a regular contributor to the UK business press. He also works in Public Relations for businesses to help them communicate their services effectively to industry and consumers.

Research is still ongoing, but today the issue is not so much ‘can it be done’ as one of refining and improving, integrating the processes at clinics; it is a question of accessibility, rollout, affordability and training. As long as we can successfully bridge the two different cultures of IT and healthcare, this is a future health-scape worth achieving. Technology can provide a new suite of tools for clinicians which could revolutionise care. With an expanding as well as an ageing population, resources and time saved are of paramount importance, so efficiencies in treating serious health challenges are of ever greater value. I hope you enjoy the issue.

Richard Forsyth
Editor



Contents

4 Research News

EU Research’s Richard Davey takes a look at current events in the scientific news

10 Beads on String

Dr Yuval Ebenstein tells us about the BeadsOnString project’s work in developing a toolbox to reveal genetic and epigenetic variation in genomic DNA and chromatin

12 BeyondSeq Project

Holding significant implications for research into the underlying causes of disease, the technology being developed within BeadsOnString will also be used in the BeyondSeq project, an EC-funded initiative that Dr Yuval Ebenstein is also coordinating

13 Neural Messages of Dynapenia

Dr Thierry Keller explains that while robots are already used to provide certain types of therapy, researchers in the European Network on Robotics for NeuroRehabilitation aim to improve them further

16 Systematics

Neural stem cells maintain the capacity to divide and generate differentiated neurons in the adult brain, helping the central nervous system to adapt to new challenges, yet their abundance declines as we age. We spoke to Dr Laure Bally-Cuif about the SYSTEMATICS project’s research into the maintenance and mobilisation of neural stem cells


19 Arterial Spin Labelling in Dementia

The ASL in Dementia project aims to develop a cost-effective tool to measure the amount of blood delivered to specific areas of the brain, which is an important indicator of dementia, as Professor Xavier Golay explains

22 Encoding in Axons

Improved optical methods mean neuroscientists can study individual nerve cells in greater detail than ever before. This work at the cellular level could lead to important insights into how information is stored and coded within the brain, as well as the development of neurodegenerative disease, as Professor Maarten Kole explains

25 IndivuHeart

The incidence of heart failure is rising, yet our understanding of it remains relatively limited. Professor Thomas Eschenhagen tells us about the IndivuHeart project’s work in developing a clinically applicable test, which could be a step towards individualised risk prediction and therapy for heart failure

28 The Role of Dopamine Receptors

The D2 dopamine receptor in the kidney is thought to protect against renal injury and inflammation, while genetic variants of the receptor can lead to a decrease in its expression. The presence of these variants can mean people are more susceptible to renal inflammation, fibrosis and ultimately renal injury, as Dr Ines Armando of George Washington University explains

30 GT-SKIN

Researchers continue to investigate new gene therapy methods, aiming to develop more effective means of correcting genes as a way of treating disease. Dr Fulvio Mavilio, coordinator of the GT-SKIN project, tells us about their work in developing a new gene targeting and gene correction technology, which could be used in the future to treat genetic diseases

32 Patient Specific Treatments

From computer simulations to growing artificial heart muscles – there are some ground-breaking projects underway that intend to refine our ability to understand and manage the diseases which affect so many – but in a way that is personalised to each patient. By Richard Forsyth

37 VisCul

The traditional approach in the computer vision field has been to learn about individual concepts in isolation, with the computer starting again from scratch for each new concept. The VISCUL project is developing a new approach where a computer uses its pre-existing knowledge when learning about new concepts, as Principal Investigator Dr Vittorio Ferrari explains

40 FractFrict

Crack propagation is the most common cause of material failure. Now researchers are using innovative real-time measurements to gain new insights into the singular fields found at the tips of propagating cracks. This research holds real importance for our understanding of both material strength and earthquake dynamics, as Professor Jay Fineberg explains

41 Disaster Bioethics

Ethical dilemmas frequently arise in the aftermath of natural disasters, yet few resources are specifically available to help guide people on the ground in their decision-making. We spoke to Dr Dónal O’Mathúna



42 Carbonsink

The primary objective of the Carbonsink project is to resolve the fate of the carbon deposited in marine sediments and oceanic crust. We spoke to project coordinator Dr Sasha Turchyn about their research into how the removal of carbon from the earth’s surface is regulated over geological timescales

43 Across Borders

Sai Island, located between the second and third cataracts of the Nile in modern-day Sudan, was occupied by Egypt during the period of the Egyptian New Kingdom. Artefacts and archaeological remains on the island will offer new insights into settlement patterns in Upper Nubia during Pharaonic times, as Professor Julia Budka of the AcrossBorders project explains

46 Sefira

The European Commission recently financed a study to review current air quality legislation and to assess how its public acceptability can influence the effectiveness of future policies. This is a context in which the work of the SEFIRA project takes on real importance, as project coordinator Professor Michela Maione explains

48 Philosophical Collaboration

Research in the physical sciences is often built on collaboration, yet there tends to be a different approach in the arts and humanities, where papers are largely attributed to one individual. Academics at University College London have introduced an experimental module this term for philosophy students, which will place greater emphasis on collaboration

52 MC@NNLO

The Large Hadron Collider (LHC) has recently been re-started, with experiments at higher energies promising to generate new insights into the interactions of fundamental particles. The MC@NNLO project aims to establish a new level of theoretical precision to describe the data generated by particle collider experiments, as Professor Nigel Glover explains


54 Phelix

The PHELIX project is drawing inspiration from the natural world as researchers seek to develop biological strategies to design and produce smart materials based on the structure of the helix. These materials of the future will adapt to changes in their environment and drive a paradigm shift in our engineering philosophy, as project coordinator Professor Nathalie Katsonis explains

58 Reactafire

While many architects are keen to make greater use of timber in construction, they’re currently held back by its fire protection performance. Darren Atkins and Clive Atkins tell us how the Reactafire project’s work in developing a unique advanced coating system will help improve structural fire protection for timber

60 TeamControl

There is a clear disciplinary divide in the study of self-control, with philosophers emphasising willpower, while economists look towards external mechanisms by which an individual can commit their future self. The TeamControl project aims to bring elements of several disciplines together to develop a novel account of how we achieve self-control, as Dr Natalie Gold explains

62 Sastravid

Effective online tools can enable students to rapidly find information on concepts and ideas central to their studies. The Sastravid project is developing a new web-based research tool under the direction of Dr Jan Westerhoff, which will help students of Indian philosophy to explore both the explicit connections between texts and their conceptual connections, as Dr David Gold explains

EDITORIAL
Managing Editor Richard Forsyth info@euresearcher.com
Deputy Editor Patrick Truss patrick@euresearcher.com
Deputy Editor Richard Davey rich@euresearcher.com
Science Writer Holly Cave www.hollycave.co.uk
Acquisitions Editor Elizabeth Sparks info@euresearcher.com

PRODUCTION
Production Manager Jenny O’Neill jenny@euresearcher.com
Production Assistant Tim Smith info@euresearcher.com
Art Director Daniel Hall design@euresearcher.com
Design Manager David Patten design@euresearcher.com
Illustrator Martin Carr mary@twocatsintheyard.co.uk

PUBLISHING
Managing Director Edward Taberner etaberner@euresearcher.com
Scientific Director Dr Peter Taberner info@euresearcher.com
Office Manager Janis Beazley info@euresearcher.com
Finance Manager Adrian Hawthorne info@euresearcher.com
Account Manager Jane Tareen jane@euresearcher.com

EU Research
Blazon Publishing and Media Ltd
131 Lydney Road, Bristol, BS10 5JR, United Kingdom
T: +44 (0)207 193 9820
F: +44 (0)117 9244 022
E: info@euresearcher.com
www.euresearcher.com
© Blazon Publishing June 2010

Cert no. TT-COC-2200



RESEARCH NEWS

EU Research’s Richard Davey takes a look at current events in the scientific news

European Commission announces Circular Economy plan

New waste targets are deemed weak and lacking ambition by critics

The EU circular economy package that has been in the works for a year was finally launched yesterday. Promising to be a “more ambitious” set of proposals than the original plans that were controversially scrapped by the European commission a year ago, the new package has received a mixed reception. Our current economic model is described as linear - we take raw materials, use them to make things, dispose of those things, then repeat. It’s a model that depends on large quantities of cheap materials and energy, and it’s hugely unsustainable, particularly as the world’s population grows along with spending power. A circular economy, which has been championed by celebrities such as Brad Pitt and Will.i.am, requires whole systems to change, creating business models that are not predicated on waste and which purposefully design products so they are suitable for repair, re-use or remaking. A recent report from the Ellen MacArthur Foundation, McKinsey and SUN (a non-profit organisation set up by the Deutsche Post Foundation) estimates that a circular economy could allow Europe to grow resource productivity by up to 3% annually, creating a net benefit of €1.8tn (£1.27tn) by 2030. The report also suggests that a circular economy would increase the average disposable income for EU households by €3,000 (£2,110).


There’s much criticism for what many see as a weak, watered-down package. Bas Eickhout, vice-president of The Greens/European Free Alliance, stated: “A year on from the initial decision by the commission to withdraw its original proposals, we have lost both time and ambition in the push to stimulate the circular economy at EU level”. Joan Marc Simon, executive director of Zero Waste Europe (ZWE), points to minor improvements in the package, such as the introduction of a system to monitor residual waste and the promotion of reuse of electrical equipment, textiles and furniture, but on the whole believes the package is too weak to deliver a circular economy. Carlos Moedas, European commissioner for research, science and innovation, said the targets set out a genuine path forward. “Because they are credible, these targets provide the private sector with the long-term certainty that will trigger investment and a lasting change in economic models.” The package will now be discussed between the European commission, the European parliament and the European council before a final agreement sometime in 2016.



‘Suicide Gene Therapy’ kills prostate cancer cells

A technique in which prostate cancer cells are genetically modified so they signal a patient’s immune system to attack them provides an effective punch against the disease, new research shows

Results from a long-term clinical trial conducted by cancer researchers at Houston Methodist Hospital show that combining radiation treatment with “suicide gene therapy”, a technique in which prostate cancer cells are genetically modified so they signal a patient’s immune system to attack them, provides a safe and effective one-two punch against the disease. “We strategically used an adenovirus, similar to the one that causes the common cold, to carry the therapy agent, a herpes virus gene that produces the enzyme thymidine kinase, or TK, directly into the tumor cells,” said E. Brian Butler, M.D., chair of the Department of Radiation Oncology. Patients then receive the anti-herpes drug valacyclovir, which the TK enzyme converts into a compound that destroys the tumor cells. He continued that once the activated valacyclovir starts destroying tumor cells, it also alerts the patient’s immune system, previously unaware of the cancer’s presence, that it is time to launch a massive attack. A Phase III patient trial, the final safety and efficacy evaluation for the in-situ immunomodulatory gene therapy before it can be approved by the Food and Drug Administration, is already underway.

Make sure you visit us on our website www.euresearcher.com. For more information regarding any of your dissemination needs, please contact us at info@euresearcher.com

CO2 emissions are declining, latest research shows

Preliminary estimates from an international group of scientists show global carbon emissions may actually have fallen by 0.6 per cent in 2015

“These figures are certainly not typical,” said Professor Corinne Le Quéré of the UK’s University of East Anglia, one of the authors of the analysis published on Monday in the Nature Climate Change journal. She said a stalling of emissions had never previously coincided with a year of more than 2-3 per cent economic growth since reliable records became available in the 1970s. The chief reason for the fall, the scientists said, is the slowdown in coal use in China. The country is the world’s largest carbon polluter by far, responsible for 27 per cent of total world emissions in 2014. China’s emissions had been rising 6.7 per cent a year over the previous decade but this growth slowed to 1.2 per cent in 2014 and emissions are expected to fall by as much as 3.9 per cent in 2015, the researchers said. That is largely because of a fall in coal consumption in at least the first eight months of 2015. Emissions also fell in the US and the EU, the second and third largest carbon polluters, with a 15 per cent and 10 per cent share of emissions respectively. But the scientists warned that it is too early to say global emissions have definitely peaked because other big emerging economies are still planning to burn large amounts of coal. The decline in emissions from the EU in 2014, for example, matched the rise in pollution in India, the fourth-largest emitter, with just over 7 per cent of the global total. “I would be very surprised if global emissions have peaked,” said Chris Field, a senior author of the latest report from the UN’s Intergovernmental Panel on Climate Change, who was not involved with the latest research. But the data do show they can peak, he added.



Irish Government Announces €5 billion Research Spending Programme

The strategy, titled Innovation 2020, details a five-year plan for science and technology which doubles the current amount of investment in research and development

One of the main objectives of the strategy is to increase total investment in R&D in Ireland to 2.5pc of gross national product (GNP). If successful, the increase would mean a €2.1bn jump in current R&D levels, and the strategy hopes to see an increase of 30% in research masters and PhD enrolments. Speaking about the strategy, minister for research, innovation and skills, Damien English, said that developing the country’s talent is critical. “Our success in delivering on our vision will depend on our people - undertaking the research, working in and creating successful enterprises, and contributing to the society in which we live. We will support talent development from primary level through to Postdoctoral research and from frontier research across all disciplines to practical application. We will support the successful deployment of that talent and research in driving innovation in enterprises and public services,” the minister said. As part of Innovation 2020 the Government aims to produce a successor to the Programme for Research in Third Level Institutions. The programme’s successor will include investment in the creation of new facilities and equipment. Taoiseach Enda Kenny says that the strategy sets out the Government’s intention to become a global innovation leader. “Our reputation for research excellence has been a major catalyst in our success in attracting and maintaining foreign direct investment, and this Strategy demonstrates that we remain strongly committed to maintaining and improving standards in the excellence of our research,” the Taoiseach said.

eCigarettes are safer than originally believed

Research shows they pump out vapour which has no toxic effect on the cells found in human lungs

Fresh research has suggested inhaling nicotine vapour could be as safe as breathing air. Scientists used a “smoking robot” to expose lung cell replicas to tobacco smoke, the vapour from two different brands of e-cigarette and fresh air. When exposed to cigarette smoke for six hours, the cells died. But after subjecting the cells to an “aggressive and continuous” dose of vapour, researchers claimed the damage to the airway tissue was similar to that of air. By employing a combination of a smoking robot and a lab-based test using respiratory tissue, it was possible to demonstrate… the e-cigarette aerosols used in this study have no [toxic] effect on human airway tissue. There are now plans to carry out the same tests using the vapour from a wider variety of e-cigarettes to confirm the results. Dr Michael Siegel, professor in the department of community health sciences at Boston University’s school of public health, welcomed the latest study as evidence of the safety of electronic cigarettes. “Despite the limitations of the research, it adds additional evidence to support the contention that vaping is a lot safer than smoking,” he stated. He called on public health bodies and anti-tobacco groups to encourage smokers to swap to vaping - a step which would “transform the nicotine market and achieve a huge public health victory”.

Bangers that battle colon cancer

European scientists are developing sausages that lower your risk of colon cancer

Eva Tornberg and her colleagues from Lund University in Sweden are leading the research, working with researchers from four other European institutions to produce the meat, with funding from the European Union. The project is of particular interest to Sweden, where colon cancer is one of the most common cancers. The idea is to imbue the meat with some of the virtues of fruits and vegetables. “If this hypothesis proves to be true, it will indicate that the risk of colon cancer can be reduced by eating a balanced diet – in other words, together with meat, eat lots of vegetables and other things that contain antioxidants,” says Eva Tornberg, a professor of food technology at Lund University. In its warning in October 2015, the World Health Organisation laid much of the cancer-causing blame on the actual processing of meat into products like sausage and bacon, not just the meat itself, stating that it had found the processing “can result in the formation of carcinogenic chemicals.” The Swedish scientists think that by extracting antioxidants from plants and berries and then adding them to the meat, the meat will become not just safer, but preventative, thanks to the health effects of antioxidants. The super-sausage development is in its early stages, but the next step is to test the sausages on animals to see if they have the desired effect.



Crowdfunding for researchers

Researchers in Australia have turned to crowdfunding to make up for a lack of funds in research grants

In Australia, 86 per cent of applications for public research grants go unfunded. To combat this, the first crowdfunding site that specialises in medical research has been launched, in partnership with major research institutions. “I’m sure every Australian wants discoveries that will cure cancers, and prevent and treat problems like dementia, but they would be shocked to find out how many medical researchers with great potential are left without money each year,” says George Crones, the founder of ‘Researchable’, a new and unique online crowdfunding platform, the first of its kind in Australia. The scheme is designed to identify high-quality research projects at well-trusted major research institutions. Once selected, Researchable does due diligence on them, and only if they meet its criteria does it use a very lean, cost-effective model to pool funds from individuals, businesses and charities to achieve something greater together. “Most importantly, we don’t fund and forget. We stay engaged with the research institutions and ensure that the researchers report back to donors on how the project is advancing,” continued George Crones. “A major advantage for research institutions is that Researchable is a one stop shop for funding. We even assist with preparing the funding proposal.” If you donate to a project you’ll be able to follow the research as it happens, get regular updates and the final report at the end. If the research ends up being published, you’ll also get a copy of the research paper. “Donating to research is an investment in the future,” says Crones, “and I founded Researchable because I want to ensure that the breakthroughs of tomorrow receive the funding they need today.”

Scientists discover the secret to charisma

Speed rather than intelligence is the key to appearing charismatic, scientists discover

According to a new study by researchers at the University of Queensland, an ability to think, and act, quickly plays a vital part in determining the level of charisma you’re perceived as having. “Our findings show that social intelligence is more than just knowing the right thing to do,” said head of the project Dr. William von Hippel. “Social intelligence also requires an ability to execute, and the quickness of our mind is an important component of that ability.” Participants in the study were assessed by their friends to determine how “charismatic”, “funny” or “quick-witted” they were. Researchers then measured their mental speed by giving them 30 general knowledge questions and asking them to answer as quickly as possible. A second study asked them to complete a number of timed pattern recognition tasks. Scientists found that those who completed the mental speed tasks more quickly were perceived by their friends as having more charisma. Von Hippel said: “Although we expected mental speed to predict charisma, we thought that it would be less important than IQ. Instead, we found that how smart people were was less important than how quick they were. So knowing the right answer to a tough question appears to be less important than being able to consider a large number of social responses in a brief window of time.” Researchers suggested that mental speed did not predict other social skills, such as interpreting others’ feelings or handling conflict well. They also stated that mental speed may make it easier to mask inappropriate reactions and make humorous associations.



Yahoo gives research community its biggest-ever machine learning dataset

Yahoo announces that it’s making the largest-ever machine learning dataset available to the academic research community through its ongoing program, Yahoo Labs Webscope

The new dataset measures a whopping 13.5 TB (uncompressed) in size, and consists of anonymized user interaction data. Specifically, it contains interactions from about 20 million users from February 2015 through May 2015, including those that took place on the Yahoo homepage, Yahoo News, Yahoo Sports, Yahoo Finance, and Yahoo Real Estate. “Data is the lifeblood of research in machine learning. However, access to truly large-scale datasets is a privilege that has been traditionally reserved for machine learning researchers and data scientists working at large companies – and out of reach for most academic researchers,” said Suju Rajan, Director of Personalization Science at Yahoo Labs. “Access to datasets of this size is essential to design and develop machine learning algorithms and technology that scales to truly ‘big’ data,” said Gert Lanckriet, professor, Department of Electrical and Computer Engineering, University of California, San Diego, in a statement. “At the Jacobs School of Engineering at UC San Diego, it will directly and significantly benefit the wide variety of ongoing research in machine learning, artificial intelligence, information retrieval, and big data applications.”

Breakthrough in colitis genetics suggests three IBD types

The largest genotype association study of inflammatory bowel disease to date showed the disease may be better explained by a spectrum of three main disease types — ileal Crohn’s, colonic Crohn’s and ulcerative colitis — rather than the current binary categorization. Researchers said this data provides a better understanding of IBD progression and could lead to more effective treatments. “This new research strongly suggests that we are dealing [with] a number of different diseases hidden within Crohn’s disease and ulcerative colitis, constituting a large spectrum of inflammatory bowel disease,” said Dermot McGovern, MD, PhD, MRCP(UK), director of translational medicine in the Widjaja Foundation Inflammatory Bowel and Immunobiology Research Institute at Cedars-Sinai. In a press release he added: “We have very effective therapies for IBD if we use them sooner in the disease, especially for those patients who are at risk for developing a serious form of illness. We want to understand what the important, singular, genetic signature is for each individual patient because they may respond to available therapies very differently, even with the same IBD diagnosis.” “For many of our patients, these new genetic insights could be very beneficial,” McGovern said in the press release. “But we also need to look more closely at some of the sickest IBD patients in hopes of providing more effective treatment and disease management.” Joanna Torres, MD, and Jean-Frédéric Colombel, MD, from the Henry D. Janowitz Division of Gastroenterology, Icahn School of Medicine at Mount Sinai, New York, wrote in an accompanying commentary that the study, “which focused mainly on genetic factors, provides only limited new phenotypic insights. No genetic loci associated with perianal disease, upper gastrointestinal disease, or extraintestinal manifestations were identified. No genetic predictors of disease progression in Crohn’s disease, or of disease extent or proximal extension in ulcerative colitis, could be detected.” They concluded that because the analysis was restricted to known IBD genetic variants, other important loci may remain unidentified.




Carlos Moedas announces European code on science misconduct to be updated

Moedas tells EU research ministers that Europe’s code on research integrity must change to reflect the new realities of collaborative research

Research Commissioner Carlos Moedas has told EU research ministers that the Commission will update the code of conduct for researchers, so as to better discourage fraud and other misconduct and reflect the fact that an increasing amount of publicly-funded research is carried out beyond the walls of the university lab. At the same time, rules on ethics in Horizon 2020 will be “beefed up”, though no timetable was given for either change. The continent’s most widely recognised standard on research integrity, the European Code of Conduct on Research Integrity, dating back to 2011, was drawn up by the European Science Foundation together with the European Federation of National Academies of Sciences and Humanities (ALLEA). Now times have changed. “In a world of more open data and open access, research integrity is crucial. We need more responsibility – individual but also from the institutions,” Moedas said. Science misconduct comes in many forms and can involve inventing data, intentionally misrepresenting results, copying texts or ideas without referring to the original source, or the pursuit of a compelling story that ignores contradictory evidence. The existing rules were written mostly with academics in mind, said Maura Hiney, head of research policy and evaluation at the Irish Health Research Board. “I can see where the Commission would like to expand on today’s codes,” she said. Competitive grants these days mostly involve multiple partners, are cross-sector and often involve industry. Separately, Moedas reported on the progress of the Commission’s EU-wide RESAVER pension scheme which aims to make it easier for researchers to relocate from one country to another. The scheme is expected to go into operation next spring, and to date 280 institutions from nine EU member states have signed up for it, Moedas said. The voluntary scheme will not substitute for state-run pension systems, also known as first pillar pensions, but will provide supplementary benefits financed through employer contributions and private pension plans for individuals, so-called second and third pillar pensions. Moedas also announced that the Commission’s Science4Refugees initiative, announced in the beginning of October with the aim of pointing refugees towards job or training opportunities in universities and labs across Europe, has to date resulted in 18 registrations. Academic institutions have posted 219 vacancies on EURAXESS, the EU jobsite for research positions, using Science4Refugees branding, he said.



Beyond DNA sequencing

DNA sequencing is established as a leading technique for genomic analysis, but there are several limitations which prevent it from extracting the full range of information in the genome. Dr Yuval Ebenstein tells us about the BeadsOnString project’s work in developing a toolbox to reveal genetic and epigenetic variation in genomic DNA and chromatin

A very powerful genomic analysis technology, DNA sequencing is widely used in biological research, giving scientists access to a wealth of genetic and epigenetic information. However, it has several limitations that prevent it from extracting all the information that can be found in the genome, says Dr Yuval Ebenstein, the Principal Investigator of the BeadsOnString project. “Firstly DNA sequencing is an averaging technique. You take a large population, and you sequence on the population level. So you get an average sequence – which means that if there is an interesting but small sub-population within your population, you’re blind to it,” he explains. The project aims to develop an experimental toolbox to uncover genetic and epigenetic variations in genomic DNA and chromatin, which is difficult with existing techniques. “It’s very hard with DNA sequencing to detect very small sub-populations, such as a small fraction of diseased cells, or cancer cells, that are at a very early stage of development,” continues Dr Ebenstein. “Another problem is that sequencing extracts information from very short molecules, what’s called the read length. So you can only read a relatively small string of information from DNA – a few hundred letters. You don’t have information on how genomic observables are located with respect to each other along an individual genome.”

The third limitation that Dr Ebenstein and his colleagues aim to address is the fact that researchers currently need to amplify the DNA by using a polymerase chain reaction (PCR) when sequencing. PCR acts almost as a photocopier of DNA, and the technique is used extensively in biology to create copies of DNA. “You can take a limited amount of sample and amplify it and make millions of copies, so that you have enough DNA to run your sequencing experiment. The problem is that during this PCR copying process some of the information is lost,” explains Dr Ebenstein. Our DNA is basically comprised of a long chain of four DNA letters – A, G, T and C – and the different combinations of these four letters make up the genetic code; however, Dr Ebenstein says there’s also some further information to consider. “It turns out that beyond this code, which is termed the genetic code of identity of the bases, there’s more information,” he outlines. “There are all kinds of chemical modifications to the DNA, which are like an annotation to the DNA – and they change the function of these DNA bases, without changing their identity. These epigenetic marks are basically erased during the PCR process.”

Biochemical toolbox

The project aims to address these problems by basically unravelling chromosomes and looking at DNA effectively as a very long piece of string, expanding the range of information that can be measured simultaneously. Researchers are developing a biochemical toolbox which can be used to highlight different kinds of information on DNA with different colours. “The idea would be to stretch this DNA, which is now decorated with dots of different colours. These dots are basically fluorescent molecules that are attached to specific features of the DNA and can be used to identify the DNA, its genomic location, and to uncover all the overlaying epigenetic information,” says Dr Ebenstein. This technique could, for example, enable researchers to identify the specific location of epigenetic marks or DNA damage. “We’ve developed a way to fluorescently mark DNA damage. It’s very interesting to know if your DNA is damaged at specific places when you’re exposed to the sun,” continues Dr Ebenstein. “One of the things we’re trying to do is to taint these damaged sites with a specific colour, and when we stretch the DNA and map it, we want to see if we have accumulations of damage at specific locations. So if a cancer patient is taking a drug, and one of the side-effects is UV sensitivity, then what happens to these cells if they’re exposed to the sun?”

This kind of information is not easily accessible with existing sequencing technologies, as they work by looking at the population level. Dr Ebenstein and his colleagues aim to develop a toolbox that allows researchers to see the properties of individual DNA molecules. “We’re sensitive to a lot of information that’s not easily accessible with DNA sequencing. For example, with a single snapshot on the microscope, we can see all the different kinds of information represented with different colours on a DNA molecule,” he explains. Researchers are developing chemoenzymatic approaches to label these different types of information with a different chemical moiety, combining chemistry with enzymology. “Enzymes are natural molecular machines, that have a lot of functions in biology. There are many types of enzymes, and usually they perform a specific task,” outlines Dr Ebenstein. “For example, there is a family of enzymes called restriction enzymes, which are molecular scissors. They know how to read a certain word on DNA, and cut the DNA at that word. These restriction enzymes evolved to defend bacteria from invading viruses. Once the DNA from the viral genome goes into the bacteria, these molecular scissors identify it, and they cut it up.”

The project is using such enzymes in a different way, using their ability to perform sequence-specific chemistry on DNA molecules. DNA is a double-stranded molecule, so to cut it you have to cut both strands. These enzymes have been mutated so that instead of cutting DNA into two pieces they only cut one strand, leaving a nick. “So these enzymes know how to read a specific word of DNA, and to cut one strand of it. Now, we introduce another enzyme, DNA polymerase. This enzyme repairs DNA – when it sees the nick, it grabs a base from solution and it heals the nick. It introduces it into this gap and fixes the DNA,” explains Dr Ebenstein. Researchers then effectively trick this enzyme, giving it a nucleotide, a base, with a fluorescent molecule attached. “This enzyme is now actually doing the chemistry for us – it’s introducing a fluorescent molecule, a light-emitting molecule, into the exact word where the nicking enzyme has made its nick,” continues Dr Ebenstein. “This system has evolved naturally for millions of years, and is very efficient and very specific. We take this system out of its natural context and use it for a biotechnological application.” Dr Ebenstein and his team utilize this concept of manipulating the activity of naturally occurring enzymes for labelling various epigenetic marks in the genome, thus expanding the available toolbox for single-molecule genomic barcoding and mapping.
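As a toy illustration of the barcoding idea described above: because the nicking enzyme labels every occurrence of one specific DNA “word”, the positions of the fluorescent dots along a stretched molecule form a fingerprint of that molecule. The short Python sketch below marks where such labels would attach; the recognition motif and example sequence are invented for illustration and are not the project’s actual reagents.

```python
# Toy model of sequence-specific optical barcoding: a nicking enzyme
# recognises a short DNA "word", and a polymerase deposits a fluorescent
# nucleotide at each recognition site. The list of label positions along
# the stretched molecule is its optical "barcode".

RECOGNITION_SITE = "GCTGAGG"  # hypothetical nicking-enzyme motif

def label_positions(dna: str, site: str = RECOGNITION_SITE) -> list[int]:
    """Return every position where a fluorescent label would attach."""
    positions = []
    start = dna.find(site)
    while start != -1:
        positions.append(start)
        start = dna.find(site, start + 1)
    return positions

if __name__ == "__main__":
    molecule = "ATGCTGAGGTTACGCTGAGGCCGT" * 3  # made-up example sequence
    print(label_positions(molecule))  # the spacing pattern is the barcode
```

Matching such a pattern of spacings against the expected label positions in a reference genome is, in this simplified picture, how a single imaged molecule can be assigned to its genomic location.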

At a glance

Full Project Title
Experimental Toolbox for Unmasking Genetic / Epigenetic Variation in Genomic DNA and Chromatin (Beads on String Genomics)

Project Objectives
The ground-breaking goal of this research project is to establish a robust experimental toolbox – ‘beads-on-string’ – for integrated genetic/epigenetic profiling of native chromosomes. A successful accomplishment of this goal will allow the characterization of genomic variation otherwise hidden by ensemble averaging and will open new horizons in genomic research and personalized medicine.

Project Funding
1,627,600 euros

Contact Details
Principal Investigator, Dr Yuval Ebenstein
School of Chemistry
Tel Aviv University
Israel
T: +972-3-6408901
E: uv@post.tau.ac.il
W: nanobiophotonix.com

Dr Yuval Ebenstein

Yuval Ebenstein studied chemistry and physics at the Hebrew University in Jerusalem, Israel, where he also completed his Ph.D. in physical chemistry with Prof. Uri Banin, studying the photophysical properties of individual semiconductor nanocrystal quantum dots (QDs). He then moved to work as a postdoc with Prof. Shimon Weiss at UCLA, where he used QDs to light up individual DNA binding proteins and map them along bacteriophage genomes. In the summer of 2011 he set up the NanoBioPhotonix Lab in the department of chemical physics, school of chemistry at Tel Aviv University.



BeyondSeq Project

This work holds significant implications for research into the underlying causes of disease. The technology being developed within BeadsOnString will also be used in the BeyondSeq project, an EC-funded initiative that Dr Ebenstein is coordinating. Researchers from seven groups around Europe aim to use the concept of looking at individual DNA molecules in order to improve the diagnosis of specific conditions, including several types of cancer, rare genetic disease and cases of infection by antibiotic-resistant bacteria. “The objective is to develop new technologies to provide complementary solutions to sequencing and thus analyze the hidden dimension of genetic mutations,” says Dr Ebenstein. Emerging optical DNA mapping technologies will be used to analyse long, individual DNA molecules, with the aim of providing medically relevant genomic information. “The project will establish a robust platform and workflow for integrated genetic and epigenetic profiling of single DNA molecules. We will also develop a set of specific diagnostic assays based on optical mapping of individual DNA molecules,” outlines Dr Ebenstein. “We plan to look towards early stage commercialization of reagents, prototype DNA barcoding devices and data analysis software based on the outcome of proof of principle demonstrations.”

The ultimate goal of the project is to translate recent scientific breakthroughs in optical genome mapping into diagnostic applications. This work could eventually have a significant impact on healthcare, helping improve diagnosis of several different conditions. “The methods and tools developed in the BeyondSeq project will seed a wide range of novel, gene-based diagnostic platforms, all based on single-molecule analyses. We expect that most of the technologies developed in this framework will be directly translatable via new or existing ventures, and will have an immediate impact on healthcare,” says Dr Ebenstein. “They will provide faster, lower-cost and broadly accessible molecular diagnostic platforms for early detection of cancers, infections and other diseases.”

At a glance

Full Project Title
Genomic diagnostics beyond the sequence (BeyondSeq)

Project Objectives
The goal of the BeyondSeq project is to bridge the technological gap between cytogenetic diagnostics, which can identify chromosomal aberrations, and next generation sequencing (NGS), which can detect single base-pair mutations. The mission of the participants in this project will be to develop a set of tools, from systems for extracting long DNA molecules and preparing samples through to analysis software to interpret genetic information.

Project Funding
6 million euros

Project Partners
Tel Aviv University, Israel · Technion – Israel Institute of Technology, Israel · Chalmers University of Technology, Sweden · Lund University, Sweden · University of Leuven, Belgium · The University of Birmingham, United Kingdom · Genomic Vision, France · Impasara Limited, United Kingdom

Contact Details
Principal Investigator, Dr Yuval Ebenstein
School of Chemistry
Tel Aviv University
Israel
T: +972-3-6408901
E: uv@post.tau.ac.il
W: www.beyondseq.eu

BeyondSeq Focus

The main focus of Dr Ebenstein’s group is single-molecule genomics, but they are also working to develop new optical detection schemes and novel imaging techniques. The group explores genomes, utilizing tools and reagents from the realm of nanotechnology, aiming to learn new things about these systems by zooming in on individual elements – single cells, single chromosomes and single molecules. Research in the laboratory is highly multi- and inter-disciplinary and the team is comprised of chemists, biologists and physicists who are interested in learning from each other and doing some great work at the very forefront of science. The laboratory specializes in many areas of optical imaging and spectroscopy, with a particular emphasis on single molecule detection and the development of imaging based techniques. The research is focused on the application of novel imaging and optical detection approaches to genomic studies and biomarker detection. Researchers are developing new spectroscopy and microscopy methodologies that combine advanced optics with tools and reagents from the realm of nanotechnology. In addition, the group is deeply interested in developing unique biochemistries for genomic analysis that are based on chemoenzymatic reactions.



New insights into aging

The loss of muscle strength as we age has a significant impact on health, leading to functional limitations and increased risk of mortality. Dr Brian Clark tells us about his research into the causes of muscle weakness in the elderly, and how this could underpin the development of interventions to enhance healthy aging

The process of aging is currently the focus of a great deal of research attention, as healthcare authorities seek to adapt to the challenges posed by our aging population. Based at the Ohio Musculoskeletal and Neurological Institute (OMNI), Dr Brian Clark is leading research into the neural mechanisms of dynapenia, a condition which is broadly defined as the age-related loss of muscle strength and power. “Our research involves looking at the causes of muscle weakness and physical limitations in seniors – particularly focusing on the nervous system,” he outlines. The vast majority of previous scientific investigations into age-related muscle weakness have focused on muscle tissue, both in terms of overall size and the rate at which it’s shrinking; while Dr Clark acknowledges that this is an important factor, he believes there are also other issues to consider. “Muscle size doesn’t explain the whole story by any means. So there are a lot of other physiological causes of muscle weakness and physical limitations in older adults,” he says.


Nervous system

Research by Dr Clark and his colleagues over the past 10-15 years has consistently pointed to changes in the nervous system, the brain and the spinal cord as being major factors. The aim now is to perform non-invasive tests on elderly people with different strength characteristics to gain further insights into the causes of muscle weakness. “We’ll measure muscle strength in a variety of different ways, and we’ll also use imaging technologies – specifically DEXA scans, or dual energy X-rays – to quantify how much muscle tissue an individual has. So we can express their muscle strength relative to their amount of muscle tissue,” outlines Dr Clark. Several other parameters will also be investigated. “We’ll look to see whether or not differences are apparent between the stronger people – the stronger seniors relative to their muscle mass – and weaker individuals when it comes to several neurophysiological parameters,” continues Dr Clark. “So, we’ll measure excitability of the brain using transcranial magnetic stimulation, we will also modulate the brain’s excitability using transcranial direct current stimulation. Additionally, we will obtain MRIs of the brain and also look at the way the motor nerves branching off of the spinal cord discharge by using a new technique called decomposition electromyography.”

One hypothesis put forward to explain muscle weakness in older people with dynapenia is that they have a reduced ability to voluntarily activate skeletal muscle, that is to activate their nervous system in such a way as to tell their muscles to act optimally, due to higher levels of intracortical inhibition. The question of whether an individual can optimally activate their muscles can be investigated in a laboratory setting. “We take an individual’s leg, let’s say, then hook it up to something that can measure its force, and then we secure them in this device. We put a couple of big, rubberized pads on different parts of their leg, and then we put a very brief electrical stimulus into the two big signaling pads,” explains Dr Clark. Researchers then follow a clear set of steps to assess how the nervous system is activated. “What we’ll then do is we’ll ask the individual to contract their leg muscles as hard as they can, so we’ll say: ‘OK, on the count of 3 I want you to ease into this for the first second, then I want you to push out against this thing as hard as you can.’ And we’ll give them some verbal encouragement, and they’ll push and contract their muscles as hard as they can,” outlines Dr Clark.

At this point researchers stimulate the muscle again and then ask the individual to relax, before stimulating it again around one or two seconds later. This enables researchers to gain important information about the activation of the nervous system. “The question we’ll ask is – when we stimulated your muscles while you were contracting, did the amount of force they were producing go up? If it did, it tells us that you were not able to optimally activate the nervous system, that there was more in the reserve than you were able to activate,” says Dr Clark. This can be expressed as a percentage of the amount of force that was delivered for that same stimulus after the individual relaxed. “Somebody may be able to activate their muscle at, say, 90 percent of their maximum. So we’ll say they’ve got 90 percent voluntary activation,” continues Dr Clark. “Sometimes people will reverse that and say: ‘OK, if somebody can activate it at 90 percent, that means 10 percent is inactive’. So occasionally you’ll see somebody refer to it as voluntary inactivation.”
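In equation form, the arithmetic Dr Clark describes works out as follows (a minimal sketch of the standard twitch-interpolation calculation; the symbols are illustrative labels, not the study’s own notation):

$$\text{voluntary activation (\%)} = \left(1 - \frac{F_{\text{sup}}}{F_{\text{rest}}}\right) \times 100$$

Here $F_{\text{sup}}$ is the extra force evoked by the stimulus delivered during a maximal voluntary contraction, and $F_{\text{rest}}$ is the force the same stimulus evokes in the relaxed muscle. A superimposed twitch one tenth the size of the resting twitch therefore gives the 90 per cent activation, or 10 per cent inactivation, of Dr Clark’s example.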

The loss of voluntary muscle strength has a significant impact on health, leading to an increase in functional limitations and mortality, which researchers aim to address through the development of interventional strategies to enhance healthy aging. Exercise training is one intervention which has been shown to be effective in the elderly. “Elderly muscle actually responds phenomenally well to exercise. A few studies have even shown that older adults’ muscles respond better to resistance exercise training and other exercise programs than young adults,” stresses Dr Clark. The effectiveness of these types of interventions can of course vary according to the lifestyle of the individual. “If we take 100 older adults, and ask them to do a standardized exercise program, you’re going to see some people who respond very well to it, and some people that don’t respond as well,” says Dr Clark. “Now, most people respond to some extent; rarely do we see a true ‘non-responder’. Then, of course, the question is: why is it that certain people are responding better than others?”

Interventions

There are a number of issues to consider here, including the genetic background of the individual and various other lifestyle factors. Dr Clark and his colleagues aim to take these different factors into account in the development of effective interventions. “We’re looking for a solution that’s multi-factorial, in that it can alter all the different things that affect it. Exercise training is probably one of the keys – we’ll have a difficult time treating dynapenia without involving exercise,” he acknowledges. A number of sub-studies are being run in the project to gather data on the impact of specific interventions or modes of behaviour on muscle strength. “One of these sub-studies will involve asking people to undergo twelve weeks of progressive resistance exercise training. We think that this not only increases muscle tissue size, but it also seems to have a very strong and profound effect on the nervous system. So that will help us tease out some important mechanisms, as well as investigate the potential of exercise training as a solution,” says Dr Clark. A second sub-study will be run in which people will be randomly assigned to either a control group, that is told to just continue normal life for six weeks before returning for tests, or asked to do what Dr Clark calls mental imagery training for the same period. This is more of a mind-body type of intervention. “It doesn’t involve exercise per se, rather it involves people thinking about doing activities. They actually try to urge themselves on and to feel themselves doing that particular activity,” says Dr Clark. In the future, the researchers also plan to investigate combining different types of interventions. “At some point, we will look at the effects of combining exercise with nutritional supplementation, or pharmacological supplementation,” continues Dr Clark. “What are the best kinds of exercise? Is it critical for the exercise to involve really high levels of intensity? Or is moderate intensity better because that’s more tolerable, and people don’t put their joints under as much stress? These are some of the specific things that we’re still trying to figure out.”

These questions will form an important part of Dr Clark’s future research agenda, while studies into the role of the brain and the spinal cord in dynapenia are ongoing. In their next study, titled The UNCODE Study (Understanding the Neural Contributors of Dynapenia in Elders), Dr Clark and his colleagues will aim to identify what mechanisms are involved in muscle weakness at the level of the brain and the spinal cord, which could lead to the development of more effective interventions. “We aim to really pinpoint where the causes of the weakness are coming in, as well as the interactions between that and some cognitive and psychological issues. Once we know that, I think we’ll be at a stage where we can really start to look at optimizing strategies and interventions by testing them in clinical trials, to determine whether or not they’re effective in promoting healthy aging,” he outlines. New findings will drive the next stage of development, and researchers are exploring a variety of ideas to enhance healthy aging. “We’re looking at utilizing mobile phone applications to enhance motor function with aging, while we’re also investigating whether or not we can use non-invasive brain stimulation techniques to enhance exercise capacity, and/or utilize certain pharmacological compounds or supplements to serve as an adjunct to exercise,” says Dr Clark.

At a glance

Full Project Title
Unraveling the Neural Contributors of Dynapenia in Elders (The UNCODE Study)

Project Objectives
The UNCODE Study will better identify the neurological causes of muscle weakness associated with aging. Additionally, it will examine the potential for different interventional strategies to enhance muscle function.

Project Funding
This work was supported in part by the following grants from the National Institutes of Health (NIH) to BC Clark: R01AG044424 from the National Institute on Aging, R01AT006978 from the National Center for Complementary and Integrative Health, and R21AR063909 from the National Institute for Arthritis and Musculoskeletal and Skin Diseases.

Contact Details
Brian C. Clark, Ph.D.
Ohio University
OMNI and Dept. of Biomedical Sciences
250 Irvine Hall
Athens, Ohio 45701
T: +1 740 593 2354
E: clarkb2@ohio.edu
W: www.ohio.edu

Dr Brian Clark

Project Coordinator

Dr Brian Clark is the Executive Director of the Ohio Musculoskeletal and Neurological Institute. His overarching research goal is to develop effective and implementable interventions that increase muscle function. He is a Fellow of the American College of Sports Medicine and a standing member of the National Institutes of Health’s (NIH) Motor Function, Speech and Rehabilitation (MFSR) study section.



Neural stem cells maintain the capacity to divide and generate differentiated neurons in the adult brain, helping the central nervous system to adapt to new challenges, yet their abundance declines as we age. We spoke to Dr Laure Bally-Cuif about the SYSTEMATICS project’s research into the maintenance and mobilisation of neural stem cells

Unravelling the secrets of stem cell maintenance

The central nervous system is largely developed during the early stages of life, with progenitor cells dividing to form neurons and glial cells. Neural stem cells continue to play an important role in the adaptation and growth of the central nervous system during adulthood, an area which forms the primary research focus of the Systematics project. “The general aim of the project is to try and understand which molecular and cellular mechanisms control the formation, maintenance and recruitment of neural stem cells in the adult brain in vertebrates,” says Dr Laure Bally-Cuif, the project’s Principal Investigator. These neural stem cells maintain their proliferation capacity and are capable of generating neurons and glial cells, right through into adulthood. “It was believed until 15-20 years ago that no such cells were maintained in the adult mammalian brain. Then it was found that this was not true. There are such cells in the adult brain in all vertebrate species that have been looked at so far,” explains Dr Bally-Cuif. “It was shown that the neurons that are formed from these cells in adult mammals are very important, because these neurons are more plastic than neurons that are born during development.” These neurons are used to adjust brain function, for example by forming new memories in various different contexts. While a number of these neural stem cells are maintained in the mammalian brain, researchers have found that they are gradually lost or silenced throughout life, an area of great interest to Dr Bally-Cuif. “The mechanisms responsible for the loss or silencing are very poorly understood,” she says. The project is using the zebrafish as a model system to investigate the mechanisms involved in stem cell maintenance. “Mammalian embryos have, proportionally to their size, roughly the same number of progenitor cells – that can generate neurons throughout life – as fish embryos. But whereas mammals lose or silence them, bony fish maintain many of them in an active state over the long term. So these fish, such as the zebrafish, are very good model systems to try and understand the mechanisms that control the maintenance of these stem cells,” continues Dr Bally-Cuif. “We will use the zebrafish to try and figure out the molecular and cellular mechanisms that maintain the stem cells in the brain of the fish. The idea will be to use this as a framework to understand what goes wrong in either physiological conditions where stem cells are lost, like aging, or in cases of diseases where the stem cell pools are altered. This is the case in cancer for example, or in some mood disorders, pathological conditions that can also be efficiently mimicked in fish.”

Adult zebrafish (size: 3 cm)

Progenitor cells

This process of losing active stem cells begins very soon after birth in mammals. Most of the progenitor cells that are used to make neurons and build the mammalian brain are active during embryogenesis, but this number drops dramatically after birth; those active stem cells that remain are maintained in two main niches in the adult brain. “In rodents, one niche is involved in making neurons that are used in the olfactory bulb, for learning new odor-linked information. The other is found in the hippocampus – a brain area involved in spatial memory. The latter niche is active in man as well. So these neurons are used to make or retrieve new memories,” explains Dr Bally-Cuif.

The number of neural stem cells that are maintained in an active state decreases dramatically as humans age, which it has been suggested could be a factor in memory loss; Dr Bally-Cuif says the pattern is very different in fish. “There’s no such drop in the number of neural stem cells in fish after hatching, and large numbers of active neural stem cells are maintained virtually everywhere in the brain until adulthood,” she outlines. “An adult zebrafish can live for up to three years in the wild, and usually up to two years in the lab. They are sexually mature at three months of age, and the number of stem cells is roughly maintained between the ages of three and six months.”

Sara Ortica-Gatti, engineer in molecular biology, generating transgenic fish lines.

From this point the number of stem cells decreases and the zebrafish starts aging in a similar way to mammals, making them an effective model to study changes in neural stem cell maintenance during aging. Another key advantage is the fact that the germinal zone in the adult zebrafish forebrain – the location where neural stem cells are found – is located superficially underneath the skull. “The central nervous system is organised as a kind of hollow structure, like a tube, with fluid circulating inside. The ventricular zone of the central nervous system, where stem cells are located, is the row of cells that is in contact with this fluid,” outlines Dr Bally-Cuif. In zebrafish this tube opens dorsally and flips over, so that the ventricular zone becomes superficially located, which means the germinal zone is then directly accessible for imaging. “This is impossible in mammals, because the germinal zone is a very deep internal structure in the brain,” explains Dr Bally-Cuif. “If you came from the top of a mouse’s head with a microscope, you would need to go through all the neuronal layers of the brain before you reached the inside of the tube and the stem cells, whereas in zebrafish the stem cell layer is immediately below the skull. So we make use of this in the project.”

Zebrafish facility at the NeuroPSI institute. (Built by Schwarz Aquarienbau, Göttingen, Germany.)

Researchers can image directly onto the brain, and the first cell layer they encounter is the stem cell layer. The behaviour of these stem cells can be imaged directly in their endogenous niche, without having to open the skull, which Dr Bally-Cuif says represents a major step forward. “This is really the first time that it’s been possible to image stem cells in their completely normal environment in a vertebrate,” she says. The project is using this approach to gain insights into how stem cell pools are maintained, particularly in terms of quiescence, a state in which cells will not divide, but may be stimulated to do so later. “One of the key parameters of stem cell maintenance that we try and understand genetically is quiescence,” continues Dr Bally-Cuif. “It’s been shown in some other systems that the frequency of stem cell activation – the number of times that they get out of quiescence to divide – conditions their lifespan. Stem cells don’t seem to have an infinite capacity to activate, and quiescence is a way of regulating this. So if you activate a stem cell too much you will lose it earlier. That’s what people believe is happening during aging – the stem cells have divided the number of times they were supposed to divide and so they stop. But how this is counted and regulated is poorly understood.”
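That counting argument can be made concrete with a back-of-the-envelope model. The sketch below is purely illustrative, with an invented division budget and invented activation rates, but it shows why a higher activation frequency exhausts a stem cell sooner.

```python
# Back-of-the-envelope model (assumed numbers, illustrative only): if each
# stem cell has a finite budget of activations, its expected lifespan falls
# in direct proportion to how often it is pushed out of quiescence.
DIVISION_BUDGET = 30              # assumed total activations per cell

def lifespan_months(activations_per_month):
    """Months until the activation budget is exhausted."""
    return DIVISION_BUDGET / activations_per_month

print(lifespan_months(0.5))       # rarely activated: 60.0 months
print(lifespan_months(2.0))       # frequently activated: 15.0 months
```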

Stem cell maintenance

Maintaining quiescence and controlling the frequency of stem cell activation is therefore one of the key parameters of stem cell maintenance. The project is studying this in depth, and researchers have identified at least two genetic pathways over the last couple of years that are involved in controlling quiescence.



At a glance

Full Project Title
Dynamics and Homeostasis of Germinal Zones in the Adult Vertebrate Brain (Systematics)

Project Objectives
The SyStematics project uses the teleost fish zebrafish to gain insight into the molecular, cellular and population mechanisms maintaining pools of active neural stem cells in the adult brain. It also aims to understand how these processes are perturbed in pathological or altered physiological contexts where the homeostasis of adult neural stem cell pools is disturbed.

Project Funding
€2,499,855

Project Partners
• Emmanuel Beaurepaire, Ecole Polytechnique, Palaiseau
• Jean Livet, Institut de la Vision, Paris
• Benjamin Simons, University of Cambridge, Cavendish Laboratory, UK

Contact Details
Laure Bally-Cuif
Paris-Saclay Institute for Neuroscience (Neuro-PSI)
UMR 9197 CNRS - Université Paris-Sud
Avenue de la Terrasse, Bldg 5
F-91190 Gif-sur-Yvette
T: +33 1 6982 4276
E: bally-cuif@inaf.cnrs-gif.fr
W: http://neuro-psi.cnrs.fr/spip.php?article145

Radial glia and neural progenitors in the adult zebrafish central nervous system. Than-Trong E, Bally-Cuif L. Glia. 2015 May 14. doi: 10.1002/glia.22856.

Laure Bally-Cuif

“One is called Notch signalling, which is a ligand-receptor interaction pathway that usually occurs between two neighbouring cells. A cell that has a high level of Notch signalling will be quiescent, while if Notch signalling is decreased the stem cell will activate,” explains Dr Bally-Cuif. The second genetic pathway is still the subject of active research, but Dr Bally-Cuif says the data on Notch signalling is very robust, and researchers continue to investigate its genetic basis. “There are several Notch receptors – we have identified one that is especially important for the maintenance of quiescence, which is Notch3,” continues Dr Bally-Cuif. “We are looking at this at two levels. The first is the single-cell level – so we are trying to understand how high Notch signalling controls quiescence. Notch is a receptor – it’s located at the cell membrane. When Notch signalling is activated it is cleaved from the membrane, moves inside the cell nucleus, and controls expression of target genes, which, in the context of stem cell quiescence, remain to be fully identified.”

The project is also looking at stem cells at the population level, which Dr Bally-Cuif says is a highly original aspect of their research. This involves – in collaboration with a lab at Ecole Polytechnique (E. Beaurepaire) – developing novel optical tools to image the behaviour of neural stem cell populations in situ. Stem cells are considered here not as single cells, but as cells in a population, and researchers are looking at whether the behaviour of one particular cell can influence that of its neighbour. “We believe that all cells have the capacity to be quiescent or to activate, and that the state of the neighbouring cells will influence that,” explains Dr Bally-Cuif. In a germinal zone or sheet containing 1,000 stem cells, of which 50 are activated and 950 quiescent, Dr Bally-Cuif says the 50 activated stem cells will be widely distributed. “You never find two that are next to each other. But if you looked at the germinal zone two days later, you would again find 50 neural stem cells that are activated – but they wouldn’t be the same ones,” she says. “So we believe there is some regulation that occurs at the level of the population – i.e. some emergent properties that will control where those activation events will be positioned within the stem cell sheet.”
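The statistics behind that observation are easy to sketch. The toy model below is illustrative only: a one-dimensional sheet with hard, Notch-style lateral inhibition, and all parameters invented rather than fitted to zebrafish data. It reproduces both signatures, in that activated cells never sit next to each other, and two observation days share only a handful of the same active cells (on average about 50 × 50 / 1,000 = 2.5).

```python
import random

# Toy model (illustrative only): a one-dimensional sheet of stem cells in
# which an already-active neighbour suppresses activation, mimicking
# Notch-style lateral inhibition. All parameters are invented, not fitted.
N_CELLS, N_ACTIVE = 1000, 50

def sample_active(rng):
    """Pick activated cells one at a time, rejecting any candidate that
    sits next to an already-activated cell (hard lateral inhibition)."""
    active = set()
    while len(active) < N_ACTIVE:
        c = rng.randrange(N_CELLS)
        if c in active or (c - 1) in active or (c + 1) in active:
            continue  # suppressed by an active neighbour
        active.add(c)
    return active

rng = random.Random(1)
day0, day2 = sample_active(rng), sample_active(rng)
print(len(day0 & day2), "cells active on both observation days")  # ~2-3
```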

Some scientists are exploring the possibility of stimulating stem cells to produce neurons, as a means of replacing those lost following stroke or neurodegenerative disease. However, Dr Bally-Cuif believes this remains a distant prospect, and that more needs to be learned about the fundamental mechanisms of stem cell maintenance, recruitment and fate before endogenous stem cells can be safely manipulated. “Our project aims to understand what controls quiescence maintenance versus activation,” she says. This is very much a long-term aim, but the project’s research will also have a more immediate impact in enabling the testing of molecules that stimulate stem cells. “If a company has small molecules that they see are capable of activating stem cells in vitro, the zebrafish could be a very good model to test them and see whether they work in vivo. They could identify whether the zebrafish develops tumours, and whether the neurons that are produced properly integrate into the brain – and zebrafish neural stem cells are very similar to mammalian ones,” outlines Dr Bally-Cuif. “Our work will lead to the establishment of a platform where that kind of molecule can be tested, at least as a first approximation of their efficacy and potential danger in vivo.”

Laure Bally-Cuif obtained her PhD in developmental biology and neuroscience at University Paris 6 in 1995. Following postdoctoral training at Princeton University (USA), she was recruited as a principal investigator at the Helmholtz Research Centre in Munich in 2000. She moved her research lab to the CNRS in Gif-sur-Yvette in 2010, taking up a CNRS Research Director position.

Nicolas Dray, CNRS staff scientist, analyzing an adult brain section at the confocal microscope.



The rising incidence of dementia is a major healthcare challenge, leading to an intense research focus on improved diagnosis and treatment. The ASL in Dementia project aims to develop a cost-effective tool to measure the amount of blood delivered to specific areas of the brain, which is an important indicator of dementia, as Professor Xavier Golay explains

A clearer picture of dementia

AVID Study: ASL average CBF maps for disease groups (T1, parcellation and CBF maps; posterior cerebral atrophy patients and controls).

The increasing incidence of dementia is both a major healthcare challenge and a significant financial burden, with the costs of care approaching 1 percent of global GDP. Early diagnosis of the condition will be essential to reducing the costs of care and enabling more tailored treatment, an issue being addressed by the ASL in Dementia project. “The aim of the project is to establish a new MRI technique called Arterial Spin Labelling (ASL), which is a method to measure brain perfusion,” says Professor Xavier Golay, the project’s scientific coordinator. Brain perfusion is the rate at which blood is delivered to tissue; low rates in particular areas of the brain are an indicator of dementia. “If one part of the brain is not functioning because of dementia, it will have reduced brain activity. Because it has reduced brain activity, it will need less nutrients, such as oxygen and glucose, as those cells don’t need the energy provided by these nutrients,” explains Professor Golay. “While the brain accounts for only about 1-2 percent of the overall weight of the body, it gets 40 percent of the cardiac ejection fraction, so blood flow to the brain is very high. This is because it relies primarily on aerobic glucose metabolism – together with the heart, the brain has the highest metabolism in the body.”

The baseline level of activity in the brain is typically very high, but blood flow to specific areas of the brain will decrease in an individual suffering from a neuro-degenerative disease. Professor Golay and his colleagues in the ASL network aim to use this as a means of diagnosing dementia, and potentially other neuro-degenerative diseases. “We propose using the correlation between glucose metabolism and perfusion – which is brain perfusion, or the amount of blood that is delivered to the tissue – as an alternative biomarker in dementia. This would allow us not only to diagnose the disease, but also eventually to provide early markers of response to therapy,” he outlines. The project aims to develop the ASL technique, a non-invasive method of obtaining reproducible brain perfusion measurements.



“There are four main objectives in the project. The first is to harmonise, shortlist and develop the best possible methodologies for image acquisition in MRI. The second aim is to improve image processing techniques in order to maximise the signal acquired in patients in only a few minutes through the acquisition of a series of images,” continues Professor Golay. “These images can be transformed into quantitative values through biophysical modelling, giving an exact measurement of brain perfusion.”
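To give a sense of what that biophysical modelling involves, the sketch below implements the widely used single-compartment perfusion model of the kind recommended in the ASL ‘position paper’ discussed later in this article. All parameter values are typical literature defaults rather than project-specific figures.

```python
import numpy as np

# Illustrative single-compartment pCASL quantification. Parameter values
# below are typical literature defaults, not values from this project.
LAMBDA = 0.9       # blood-brain partition coefficient (ml/g)
T1_BLOOD = 1.65    # longitudinal relaxation time of blood at 3T (s)
ALPHA = 0.85       # labelling efficiency
TAU = 1.8          # label duration (s)
PLD = 1.8          # post-labelling delay (s)

def cbf_pcasl(delta_m, m0):
    """Convert an ASL difference signal (control - label) and a proton
    density image M0 into cerebral blood flow in ml/100g/min."""
    num = 6000.0 * LAMBDA * delta_m * np.exp(PLD / T1_BLOOD)
    den = 2.0 * ALPHA * T1_BLOOD * m0 * (1.0 - np.exp(-TAU / T1_BLOOD))
    return num / den

# Example: a grey-matter-like voxel with a ~0.7% labelling signal
print(round(cbf_pcasl(delta_m=7.0, m0=1000.0), 1), "ml/100g/min")  # ~60
```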

Quantitative values

This is achieved through image processing software designed to offer the best possible levels of precision and reproducibility, which is paramount in the diagnosis of dementia. High-resolution MRI may be used to detect shrinkage of the global grey matter in the brain on the order of a few percentage points, for example – a typical value for patients with Alzheimer’s disease. However, when looking at physiological signals, the typical signal-to-noise ratio (SNR) is an order of magnitude lower, and so high levels of accuracy are required. “You are looking for very small signals, very small measures. Therefore it is very important to assess the reproducibility and precision of the machine,” says Professor Golay.

The problem here has been that MRI machines have historically worked almost as giant snapshot cameras; through the work of Professor Golay and others, they are now being transformed into scientific instruments. “A camera gives you a picture, and for many years the role of radiologists has been to look at pictures, and analyse patterns in them to infer what disease a certain patient is suffering from,” he outlines. “If you want to measure a quantitative parameter like cerebral blood flow, which has a proper physical value, then you don’t need a camera, you need a scientific instrument. A scientific instrument needs to provide more than a camera. It needs to always give you the same answer for the same input.”

ASL perfusion images of healthy controls (first column), mild cognitive impairment patients (MCI, second column) and Alzheimer’s disease patients (AD, third column). Note the progressive loss of perfusion in the frontal and parietal parts of the brain.

There are two basic ASL techniques, Continuous ASL (CASL) and Pulsed ASL (PASL), with the measurement obtained in both cases by labelling arterial blood water spins proximal to the tissue of interest. Many studies have been published on whether the measurements are reproducible; the general conclusion is that they are indeed reproducible, although there is a relatively high level of variability due to various physiological factors. “If you had a lot to drink last night then it is likely that your brain will not have the same level of blood flow as if you had had a herbal tea before going to bed and then having a good night’s rest. Less dramatic changes can be seen between 9am when you arrive at work, and 5pm when you leave,” explains Professor Golay. The main areas of the brain affected by dementia will show dramatic changes well beyond these daily modulations, yet Professor Golay says it is still important to be careful when measuring brain perfusion. “You need to assess how much the daily variation can affect your measurements, because it is important not to overdiagnose,” he stresses.
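That caveat can be expressed as a simple decision rule. The sketch below is illustrative only; the baseline value and the 10 percent test-retest variability are assumed, plausible figures rather than study results.

```python
# Toy check (illustrative only): is a measured regional CBF reduction
# larger than what day-to-day physiological variability could explain?
# The 10% within-subject coefficient of variation is an assumed figure.
BASELINE_CBF = 60.0    # ml/100g/min, assumed healthy regional mean
CV = 0.10              # assumed test-retest coefficient of variation

def is_hypoperfused(measured_cbf, n_sigma=2.0):
    """Flag only reductions beyond n_sigma of normal daily variation."""
    return measured_cbf < BASELINE_CBF * (1.0 - n_sigma * CV)

print(is_hypoperfused(55.0))   # False: within daily modulation
print(is_hypoperfused(40.0))   # True: well beyond it
```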



The incidence of neurodegenerative disease and dementia is rising though, a context in which early and accurate diagnosis takes on even greater importance. Mapping the amount of blood that is delivered to tissue in the brain can also provide more information about the type of dementia; Professor Golay points to a specific example. “Sometimes, clinically it is difficult to differentiate between frontotemporal dementia (FTD) and Alzheimer’s disease,” he says. One of the clearest differences between these patients would be the pattern of hypoperfusion. “This pattern represents the different parts of the brain that no longer function properly, and it differs between these two dementias,” explains Professor Golay. “The clinical manifestations of these two types of dementia may be very similar. However, if you are looking at a pattern of blood flow to the different parts of the brain, you will see that blood flow to the frontal lobe and parts of the temporal lobe is significantly reduced in FTD, while the temporal and parietal areas are more affected in Alzheimer’s disease. So the differences in these patterns would enable a better diagnosis.”

This approach could also be used to diagnose other conditions aside from dementia and neurodegenerative disease. A good example would be cases of stroke, where a major blood vessel in the brain is blocked, preventing blood from reaching the tissue. “In the case of stroke, the reduction in blood flow to specific areas of tissue is not related to lower brain activity in a specific area, but occurs because the blood cannot arrive there. So any method that allows you to map the amount of blood reaching the tissue would be very useful – you can see which part of the brain is affected by lack of perfusion, because one of the vessels has been permanently, or semi-permanently, blocked,” outlines Professor Golay. ASL could also help identify parts of the brain involved in epilepsy. “The seizure onset zone is the piece of tissue that first triggers an epileptic seizure. If an individual is in between two seizures then clinicians will be able to see that the local perfusion is reduced in that part of the brain,” explains Professor Golay. “So you can pick up a focal change, a reduction in perfusion, which might indicate that this part of tissue will trigger epilepsy.”
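The FTD versus Alzheimer’s pattern logic described above can be caricatured in a few lines of code. The rule below is purely illustrative: the 40 ml/100g/min threshold and the two-region logic are invented, and real differential diagnosis rests on full quantitative maps and clinical assessment.

```python
# Toy rule-of-thumb classifier (illustrative only) encoding the regional
# hypoperfusion patterns described above; the threshold is assumed.
def suggest_dementia_type(cbf):
    """cbf: dict of regional CBF values in ml/100g/min."""
    low = {region for region, value in cbf.items() if value < 40.0}
    if {"frontal", "temporal"} <= low and "parietal" not in low:
        return "pattern consistent with FTD"
    if {"temporal", "parietal"} <= low:
        return "pattern consistent with Alzheimer's disease"
    return "no clear pattern"

print(suggest_dementia_type({"frontal": 32, "temporal": 35, "parietal": 55}))
print(suggest_dementia_type({"frontal": 52, "temporal": 36, "parietal": 33}))
```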

Gold standard

The focus for the moment remains on dementia though, and the project aims to establish ASL as the standard clinical tool in the diagnosis of the condition. The final obstacle to standardised clinical and research applications of ASL is the different commercial implementations applied by the major MRI vendors. “When we started four years ago there were hundreds of different alternatives and implementations of this ASL method out there in the field. This was very confusing for both radiologists and neurologists,” outlines Professor Golay.

Researchers published a ‘position paper’ in 2014 describing how ASL can be implemented for medical imaging, which Professor Golay says has had a big impact in these terms. “This actually pushed all the main MRI manufacturers to align their methodologies so that they were in line with the recommendations of this paper,” he says. “So from a whole range of disparate methods in the field, we are now in a position where all three major MRI manufacturers are implementing very similar techniques on their platforms. This will allow the use of these methods in clinical trials, independently of the MRI manufacturers, because we will be able to compare each of the images acquired across all these different platforms.”

Need for calibration phantoms

One of the recommendations of the ‘position paper’ coming out of our ASL in Dementia COST Action was the necessity to develop a device to check that the scanners used to measure perfusion really measure blood flow accurately. These devices are called ‘phantoms’, and are normally sold by independent companies. As nobody had ever managed to come up with a reasonable phantom, we decided to set up a spinout company, Gold Standard Phantoms Limited, to sell such phantoms. That company, a true spinout of this Action, has recently been successful in attracting a research contract worth £100,000 from the NHS England Small Business Research Initiative (SBRI) scheme, allowing us to demonstrate the feasibility of producing such devices. The first results will be presented at the last meeting of the Action, on October 4-6, at Airth Castle, Scotland.

At a glance

Full Project Title
Arterial Spin Labelling in Dementia (COST Action)

Project Objectives
The objective of the ASL in Dementia COST Action is to improve and validate Arterial Spin Labelling MRI so that it can develop into a reliable clinical tool for the diagnosis and follow-up of dementia. The Action aims to provide reproducible and comparable quantitative measurements of cerebral perfusion, independent of the hardware manufacturer, the main magnetic field strength, or the particular proprietary technology employed, together with the necessary harmonised post-processing, statistical analysis and cross-validation software to be employed in clinical trials.

Project Funding
COST ASL in Dementia Action BM1103.

Partner Countries
Belgium, Croatia, Denmark, Finland, France, Germany, Iceland, Ireland, Italy, Netherlands, Norway, Poland, Portugal, Spain, Sweden, Switzerland, United Kingdom

Contact Details
Professor Xavier Golay
UCL Institute of Neurology
8-11 Queen Square, Box 65
London WC1N 3BG
T: +44 020 3448 3449
E: x.golay@ucl.ac.uk
W: www.aslindementia.org
W: http://asl-network.org

Professor Xavier Golay

Professor Golay is Chair of MR Neurophysics and Translational Neuroscience at UCL Institute of Neurology. His research interests lie at the intersection of many disciplines, such as NMR physics, chemistry, physiology and neuroscience. They include the development of MRI as a translational tool for neurological diseases, measuring identical image-based biomarkers from mouse to human, and from the laboratory to clinical settings. The COST Action is at the heart of his endeavours to bring new imaging techniques all the way into clinical practice.



The recent emergence of improved optical methods means neuroscientists can study individual nerve cells in greater detail than ever before. This work at the cellular level could lead to important insights into how information is stored and coded within the brain, as well as into the development of neurodegenerative disease, as Professor Maarten Kole explains

Unravelling the code of nerve cells

The question of how information is stored and coded in single nerve cells is one of the major research challenges in the neuroscience field. Based at the Netherlands Institute for Neuroscience (Amsterdam) and the University of Utrecht in the Netherlands, Professor Maarten Kole is the coordinator of the Encoding in Axons project, an initiative funded by the European Research Council. “Our aim is to study how electrical signals in neurons start, and then flow along entire axons,” he outlines. Axons, a type of protrusion that extends from the main body of neurons, comprise several unique and important domains; researchers are investigating single axons to look at how each of these domains works, and to identify their electrical and biophysical properties. “While the terminals at the endpoint of axons, where a chemical transmitter is released, have been explored in great detail over the last decades, we still lack fundamental insights into other regions of the axon, namely the initial segment, the internode and the node. So we thought it was a good idea to start our research there at the initial segment,” says Professor Kole. “The initial segment is the only part of the axon which also can be seen with basic microscopy, so it’s the most accessible region of the axon.”

The recent emergence of new optical techniques means researchers can now visualize these structures much more clearly than was previously possible. With high-frequency pulsed lasers, researchers can visualize structures of less than a micrometre, down to a resolution of 400 nanometres. “The higher resolution allows us to see specific regions of the axon and target them more directly. Also the cameras have become faster,” says Professor Kole. Researchers are using these techniques and others to investigate the mechanisms behind information encoding in single axons. “Our research focuses on one axon in particular, that starts within the cortex and in some cases runs all the way down to the end of the spinal cord. We are investigating the specific details of the membrane, such as ion channels,” outlines Professor Kole.

Initial segment

Since all excitable cells have an axon, research into how electrical signals are generated and propagated is ultimately central to understanding how the brain works. Each of the domains within an axon has a very specific role in these terms; the initial segment is fundamental to the generation of electrical signals. “I was involved in the discovery about 10 years ago that the initial segment is really the key part of the axon that starts all electrical nerve impulses. These nerve impulses are also called action potentials,” explains Professor Kole. “Axons start the action potential at a point where they also integrate signals from other cells. These are fundamental tasks of nerve cells,” continues Professor Kole.



Brain slice recording

“The axon is a very thin, small structure, and that allows for high frequencies in signaling.” The small size of axons, typically around 1 micrometre in diameter, means that they also have a small membrane area. Professor Kole says this is an advantage in terms of their ability to generate electrical signals. “The rise time of signals becomes faster if it’s generated in a small area. That is why smaller areas have a big advantage, in that they can generate signals more rapidly compared to larger areas, which would need more time to be charged by ions,” he explains. These electrical signals may also change during propagation. “As the axon is conducting electrical impulses it may change the frequencies for example – some axons cut off high frequencies,” says Professor Kole. “In electrical terms you can think about filtering information. Furthermore, we discovered a while ago that sometimes signals themselves change their width, which has an impact on information transmission.”
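The reasoning behind that rise-time argument is a simple capacitance calculation. In the toy sketch below, a fixed ionic current charges a membrane patch whose capacitance scales with area; the constants are standard textbook values, not measurements from this project.

```python
import math

# Toy calculation (illustrative only): for a fixed inward current I, the
# membrane depolarises at dV/dt = I / C. Capacitance scales with membrane
# area, so a thinner axonal segment charges faster.
C_SPECIFIC = 1.0e-2                 # F/m^2 (~1 uF/cm^2, textbook value)
I_NA = 50e-12                       # A, an assumed fixed 50 pA current

def dv_dt(diameter_um, length_um=45.0):
    """Initial depolarisation rate (V/s) of a cylindrical membrane patch,
    e.g. an axon initial segment, driven by the fixed current I_NA."""
    area_m2 = math.pi * (diameter_um * 1e-6) * (length_um * 1e-6)
    capacitance = C_SPECIFIC * area_m2          # farads
    return I_NA / capacitance

print(f"1 um axonal patch: {dv_dt(1.0):.0f} V/s")    # small area, fast rise
print(f"10 um soma-like patch: {dv_dt(10.0):.0f} V/s")  # ten times slower
```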

Axon myelination

But axons not only generate electrical impulses, they also conduct them at high speed. While the typical small diameter of axons is an advantage for the generation of electrical impulses, it imposes a disadvantage for conduction. The degree of myelination of an axon is an important issue in this regard. Myelin, a material produced by a specific type of glial cell called an oligodendrocyte, insulates axons, reduces current loss and is thought to aid the conduction of electrical impulses; however, the degree of myelination is not uniform across axons. “We know that one part of an axon can be myelinated and not another. Furthermore, there is also variability between axons. About 20 percent of axons in humans are not myelinated,” outlines Professor Kole. Researchers are also considering the level of myelination; the larger the diameter of an axon, the more wraps of myelin there will be around it. “Some axons in the periphery have to conduct signals over several metres, so they need to be densely myelinated,” continues Professor Kole. “But sometimes it’s actually required that the signal travels more slowly, so then an axon is not myelinated.” A major question in research is how axons communicate with glial cells to set up the number of myelin wraps.
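Classic empirical rules of thumb give a feel for what is at stake. In the sketch below, myelinated fibres are assumed to conduct at roughly 6 m/s per micrometre of diameter (Hursh’s rule) and unmyelinated fibres to scale with the square root of diameter; both are approximations rather than findings of this project.

```python
import math

# Classic empirical approximations (assumptions, not project data):
# myelinated fibres conduct at roughly 6 m/s per micrometre of diameter,
# while unmyelinated fibres scale with the square root of diameter,
# at about 1 m/s for a 1 um fibre.
def velocity_myelinated(d_um):
    return 6.0 * d_um              # m/s

def velocity_unmyelinated(d_um):
    return 1.0 * math.sqrt(d_um)   # m/s

d = 1.0  # a typical 1-um cortical axon
print(velocity_myelinated(d), "m/s with myelin")     # ~6 m/s
print(velocity_unmyelinated(d), "m/s without")       # ~1 m/s
# Over a 1-metre peripheral axon, that is roughly 0.17 s versus 1 s.
```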

Researchers are able to investigate the interaction between these two cell types by making direct electrical recordings from each, and then looking at them in greater detail. “We use primarily slices of the brain, and then we make direct electrical recordings with glass pipettes – it’s called the patch-clamp technique,” explains Professor Kole. It is thought that there must be cross-talk between axons and oligodendrocytes to set up the number of myelin wraps, a research field that Professor Kole and his colleagues are looking into. “How is electrical activity sensed by the oligodendrocyte? And how do the oligodendrocytes and neurons interact?” he continues. “We’re talking about distances of nanometres between these two cell types. That’s challenging to visualise, to image and to quantify with our electrical recordings.” Researchers are also using electrical recordings to test the hypothesis that the degree of myelination influences the mechanisms behind the generation of action potentials. Electrical activity is evoked in the neuron by injecting current, which evokes a particular action potential.



Computer representation of neuron surface.

At a glance

Full Project Title
Identifying mechanisms of information encoding in myelinated single axons (Encoding in Axons)

Project Objectives
The Kole research group investigates the biophysical features of both myelin and the axonal membrane in the rodent brain to identify the role of the myelin sheath in the conductive properties of nerve cells. These studies may reveal how myelin enhances the speed of electrical impulses, and the impact of myelin loss in disease.

Project Funding
European Research Council (ERC) Starting Grant 261114. The National Multiple Sclerosis Society (NMSS) RG 4924A1/1.

Contact Details
Project Coordinator, Maarten H. P. Kole
Netherlands Institute for Neuroscience, Royal Netherlands Academy of Arts and Sciences (KNAW), Meibergdreef 47, 1105 BA, Amsterdam, The Netherlands
T: +31 20 566 4594
E: m.h.p.kole@uu.nl
W: http://cellbiology.science.uu.nl/research-groups/maarten-kole
W: http://orcid.org/0000-0002-3883-5682

Kole MHP, Ilschner SU, Kampa BM, Williams SR, Ruben P, Stuart GJ (2008). Action potential generation requires a high axon initial segment sodium channel density. Nature Neuroscience 11(2):178-86.
Kole MHP, Stuart GJ (2012). Signal processing in the axon initial segment. Neuron 73(2):235-47.

Maarten Kole

Maarten Kole performed his postdoctoral studies (2004-2011) in subcellular electrophysiology at the Australian National University in Canberra, and in 2011 established his research group with ERC funding at the Netherlands Institute for Neuroscience (Amsterdam). In 2014 he was appointed Professor in Biophysics of Complex Cellular Interactions at the University of Utrecht.


“That’s how we assess those kinds of interactions – by externally evoking activity or an action potential within a neuron, then looking at what occurs in the oligodendrocyte,” says Professor Kole. Evidence suggests that more myelin is produced when the brain engages in new tasks, and Professor Kole says the degree of myelination has an impact on our ability to process information. “If you have more myelin you can, for example, steer certain motor tasks more quickly. There was a very interesting study on people who learned to juggle – you have to process multiple information sources, observing the ball very quickly, and respond and control arm movements rapidly and precisely. We think myelin is very important to that,” he outlines.

Neurodegenerative disease

This work also holds significant implications for our understanding of neurodegenerative disease. Professor Kole and his colleagues are investigating what happens when myelin is lost or damaged, as occurs in multiple sclerosis (MS). “MS probably starts with a fault in the auto-immune system which attacks the myelin sheath. But what happens if an axon is not myelinated? We recently published a study where we looked at the electrical properties of axons when they were de-myelinated,” he outlines. “We found that when myelin is lost the nerve impulse is conducted more slowly – as you would expect. But what was surprising was that we also found that axons become more active – so they produce more action potentials at the same time. Is that because axons have this dual function – on the one hand they start action potentials, on the other they also conduct action potentials? This is something that we think is very important.”

A patient with MS might have lost myelin in certain areas of the brain, which researchers predict would be accompanied by increased axon activity. This could explain why some patients have cognition deficits. “Increased axonal activity may be disadvantageous – it could actually reduce precision, or reduce the responses to certain impulses, or increase the risk of seizures. About 50 percent of MS patients have cognition problems. That’s something that we think is also related to our findings,” says Professor Kole. While Professor Kole plans to pursue further research into the underlying causes of neurodegenerative disease, he says many fundamental questions remain about axon properties. “The spatial resolution of the new imaging techniques that are emerging is rapidly improving. So in the future we’ll be able to investigate very fundamental questions about axons,” he outlines. “But at the same time, if we are also able to better understand how axons work, then I think that will also have implications for how we understand disease.”

The axonal signalling group.



The incidence of heart failure is rising, yet our understanding of it remains relatively limited. Professor Thomas Eschenhagen tells us about the IndivuHeart project’s work in developing a clinically applicable test, which could be a step towards individualised risk prediction and therapy for heart failure

Getting to the roots of heart failure

The development of human induced pluripotent stem cell (hiPSC) technology means researchers can now look at the function of individual cells in the body in greater detail than ever before. This technology holds enormous potential in terms of assessing an individual’s risk of suffering from heart diseases, as Professor Thomas Eschenhagen, Principal Investigator of the IndivuHeart project, explains. “With this methodology we think we can go one step further than current approaches, because we get information on the function of a differentiated human cell,” he says. The project aims to use this technology to develop a clinically applicable test to identify people at risk of suffering heart diseases. “We can get information on individual cardiac function, which could be helpful in telling the patient whether a particular mutation is serious and requires treatment, or if it’s harmless,” continues Professor Eschenhagen.

Heart failure

This is central to building a deeper understanding of both systolic and diastolic heart failure. Heart failure is a very heterogeneous condition, in which many factors can play a role; however, Professor Eschenhagen says our understanding of it is actually quite limited. “Systolic heart failure is very heterogeneous. It could be a consequence of myocardial infarction or hypertension, for example, or of many gene defects or the combination thereof. We lack a fundamental understanding of diastolic heart failure,” he outlines. The heart beats apparently normally in cases of diastolic heart failure (HFpEF), but patients still suffer from symptoms such as congestion, shortness of breath and oedema. “We don’t really know what’s going on there, and all the therapies that have been developed so far have failed. So this is really a major blank spot in cardiology,” says Professor Eschenhagen.

The project is performing in-depth clinical phenotyping and genotyping of patients with cardiomyopathies or HFpEF, aiming to gain deeper insights in this area than currently possible. While genes can be analysed in detail, and certain mutations are known to be associated with certain diseases, there are limits to existing methods. “We often don’t know if a mutation we see is meaningful for an individual, or whether it’s associated with a good or bad prognosis. So the genotype-phenotype correlation is pretty poor,” explains Professor Eschenhagen. The line between monogenic diseases and acquired forms of heart disease may also be more blurred than previously assumed, making it difficult to establish a direct genotype-phenotype correlation. “If there was a very sharp border, and some mutations always made people sick, then there would be no problem. But this is not the case,” says Professor Eschenhagen. A mutation can have very severe consequences in one individual, but no impact in another, demonstrating the challenges of basing risk predictions solely on genetic analysis. Looking directly at the function of the relevant cell type – in this case cardiac myocytes – would be significantly easier and would provide richer information, but Professor Eschenhagen is keen to point out that there are also challenges involved with the biomedicine approach.

IndivuHeart team at the University Medical Center Hamburg Eppendorf.



Principle of making cardiomyocytes from somatic cells.

The inherent assumption of using hiPSC for modelling inherited sarcomeric cardiomyopathies is that the effect of a sarcomeric mutation during cardiac development in vivo is the same as its effect on cardiac function in the dish, which cannot be valid. Only certain aspects of disease pathology will be approachable by the hiPSC technology. Eschenhagen, Mummery, Knollmann, Cardiovasc Res 2015.

“The procedures involved in the project, from the skin biopsy via reprogramming to cardiac myocytes, are highly complex, with a lot of potential for mistakes and variability,” he points out. Moreover, cardiac myocytes from hiPSC are relatively immature, and there are limited possibilities to evaluate their function in the dish. So how can researchers decide whether the differences they see between individual cell lines are really due to the individual’s genetic make-up, to chance or to an artefact? One answer is CRISPR/Cas9 technology, a method by which scientists can cut the genome at a specific location, to investigate specific mutations related to heart diseases. “So we make a parallel clone, where we take out this mutation and then we compare it to a healthy cell. Then we should be able to see the influence of this single mutation,” says Professor Eschenhagen. However, researchers can’t use this approach to analyse the impact of the entirety of genetic or epigenetic variants on a phenotype, the main goal of the IndivuHeart project.
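The logic of that isogenic comparison can be illustrated with made-up numbers: because the corrected clone shares the patient’s entire genetic background apart from the single edited mutation, any paired difference in EHT readouts can be attributed to that mutation.

```python
# Illustrative paired comparison (made-up numbers, not study data):
# contractile force of EHTs from a patient line versus its CRISPR-corrected
# isogenic clone. With the genetic background otherwise identical, the
# paired difference isolates the effect of the single mutation.
mutant    = [0.08, 0.07, 0.09, 0.08, 0.06]   # mN, hypothetical EHT forces
corrected = [0.12, 0.11, 0.13, 0.12, 0.10]   # mN, isogenic control
diffs = [c - m for c, m in zip(corrected, mutant)]
mean_diff = sum(diffs) / len(diffs)
print(f"mean force gain after correction: {mean_diff:.3f} mN")
```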

Cell function

The project combines an observational clinical study with the engineered heart tissue (EHT) technology that was developed by Professor Eschenhagen 20 years ago. EHTs – human heart muscle strips made from hiPSC-derived cardiac myocytes – beat spontaneously, develop force and can be monitored by automated video-optical analysis. With automation and standardisation, analysis of EHT function is simple and robust, which reduces variability and increases the sensitivity of the method.
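As an illustration of how a video-optical readout can be turned into a force value, the sketch below treats each silicone post as a cantilever spring, so that a tracked tip deflection maps to force. The beam formula is standard, but the modulus and post dimensions are generic assumptions, not the project’s calibration. Note that stiffening the posts, as in the mechanical stress test described later, simply raises the load the tissue works against.

```python
import math

# Illustrative force estimate (assumed parameters, not IndivuHeart's
# calibration): each silicone post is treated as a cantilever of length L
# and radius r, so a tip deflection delta maps to force F = 3*E*I*delta/L^3.
E_SILICONE = 1.7e6      # Pa, assumed elastic modulus
POST_RADIUS = 0.5e-3    # m, assumed
POST_LENGTH = 10.0e-3   # m, assumed

def force_from_deflection(delta_m):
    """Convert a video-tracked post deflection (metres) into force (N)."""
    second_moment = math.pi * POST_RADIUS**4 / 4.0
    return 3.0 * E_SILICONE * second_moment * delta_m / POST_LENGTH**3

# A 0.3 mm deflection of the post tip:
f = force_from_deflection(0.3e-3)
print(f"{f*1e3:.3f} mN")   # ~0.08 mN, a plausible magnitude for an EHT
```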


Cells are generated from little skin biopsies taken from 40 patients and 40 healthy volunteers, who also undergo an in-depth medical examination including ECG, echocardiography and magnetic resonance imaging of the heart, as well as genetic examination and a follow-up examination five years later. Skin fibroblasts are reprogrammed to hiPSC and differentiated to cardiac myocytes. The whole process is time-, labour- and cost-intensive, but has improved dramatically over the past few years and now generates hundreds of millions of cardiac myocytes. “To make EHTs, we mix the cardiac myocytes with fibrinogen, which is a clotting material, and thrombin. This quickly forms a gel,” continues Professor Eschenhagen. “The gel traps the cells in a three-dimensional space. We cast this mixture into little casting wells in between two elastic silicone posts which are inserted into the wells from a silicone rack positioned on top of the culture dish.”

The cells grow and, over two weeks, form a heart tissue-like structure, anchored on these silicone posts. This growing tissue is under continuous mechanical load from the silicone posts; Professor Eschenhagen says this effectively mimics the situation in the developing heart. “The heart has to surpass a certain resistance when it’s contracting, and that’s what we’re mimicking here with these elastic silicone posts. We know that this is very important to the development of good heart tissue,” he outlines. EHT function can then be evaluated under various conditions. The study is taking place under blinded conditions, meaning researchers do not know whether a specific sample came from a patient or a healthy control until the end of the study.

The hope is that with this method researchers will be able to define normal values for a whole spectrum of EHT functions, and see whether certain individuals deviate from these norms. “The EHT function of the 40 healthy controls should be more or less the same,” explains Professor Eschenhagen. “It will be critical to see how similar they really are, and then how large the differences are between control and diseased EHTs, under various conditions.” These studies will reveal standard values for hiPSC-EHT function under both basal and stress conditions.



At a glance

Full Project Title
Individualized early risk assessment for heart diseases (IndivuHeart)

Project Objectives
The project aims at defining normal values of human cardiomyocyte function generated from induced pluripotent stem cells, and disease-specific deviations from the norm.

Project Funding
The project is funded mainly by an ERC Advanced Grant. In addition, the overall work on the subject of disease modelling has been supported by the German Research Foundation (DFG, Es 88/12-1), the European Commission (FP7 Angioscaff, FP7 Biodesign), the Freie und Hansestadt Hamburg and the German Centre for Cardiovascular Research (DZHK), funded by the Bundesministerium für Bildung und Forschung (BMBF).

Fibrin-based engineered heart tissue (EHT) serves as an automated, high-content readout of functional parameters of hiPSC-derived cardiomyocytes in a three-dimensional heart muscle construct. (A) Setup to measure spontaneous or electrically stimulated contractile activity of EHTs cultured around elastic silicone posts in a 24-well format over extended periods. Note the temperature-, gas- and humidity-controlled incubation chamber and the PC-controlled video camera with XYZ drive above. (B) Image of a hiPSC-EHT between two silicone posts as viewed by the video camera. Note the muscular structure. (C) Overlay of averaged contraction peak (black) and Ca2+ transient (red), normalized to their respective maxima. From Stoehr et al. Am J Physiol 306:H1353-63, 2014. (D) Dystrophin-stained heart muscle structure of hiPSC-EHT. Note longitudinal orientation and cross-striation. From Hirt et al. J Mol Cell Cardiol 74:151-161, 2014.

“We envision two major stress interventions here, which closely reflect what happens in patients,” says Professor Eschenhagen. “One is of course drugs. So we are going to put the cells under the influence of catecholamines like adrenaline, a type of stress hormone.” This results in the heart beating faster; while this has beneficial effects in the short term, continuous stress is known to be bad for the heart. Researchers will also test other drugs, including erythromycin; while it is commonly used to treat bacterial infections, erythromycin also has rare life-threatening pro-arrhythmic effects, and it is not currently possible to identify who is at risk.

The other stress intervention will be mechanical. Researchers will stiffen the elastic silicone posts the EHTs are attached to, and against which they perform their contractile work. “That would effectively mimic a strong type of hypertension, or aortic stenosis. So classical diseases which also lead to heart failure,” outlines Professor Eschenhagen.


“It is very likely that the magnitude, and maybe even the quality, of responses is genetically determined. We hope to see that EHTs from our healthy volunteers are fine, while EHTs from patients with heart problems have a more drastic reaction.” The ultimate goal of the project is to integrate these findings into clinical care, helping tailor treatment to individual needs. In cases where it’s difficult to decide on the right course of treatment, or whether it’s required at all, the IndivuHeart project’s work could have a real impact. “You could take a little skin biopsy, do all these tests, and then check. If it turns out that the hiPSC-EHT function deviates from the norm, let’s say in terms of rhythm or stability, then you would take this as a reason for the early implantation of an ICD (Implantable Cardioverter Defibrillator),” says Professor Eschenhagen. “If it’s very stable though, and reacts to only a very few drugs which are known to be dangerous, like erythromycin, and only at high concentrations, then you could say it’s enough just to check once in a while and avoid drugs such as erythromycin.”

Contact Details
Project Coordinator, Thomas Eschenhagen, MD
Professor and Director, Department of Experimental Pharmacology and Toxicology
University Medical Center Hamburg Eppendorf
Martinistr. 52, 20246 Hamburg, Germany
T: +49 74105 2180
E: t.eschenhagen@uke.uni-hamburg.de
W: www.uke.de/institute/pharmakologie/index.php

Thomas Eschenhagen, MD

Thomas Eschenhagen, MD, is currently Professor of Pharmacology and Director of the Department of Experimental Pharmacology and Toxicology at the University Medical Center Hamburg Eppendorf, Germany. He is also chairman of the board of directors of the German Centre for Cardiovascular Research (DZHK). Dr Eschenhagen has concentrated his research efforts on understanding the molecular mechanisms of heart failure, with a focus on β-adrenergic signaling, its adaptation in heart failure and its consequences for contractile function.



The D2 dopamine receptor in the kidney is thought to protect against renal injury and inflammation, while genetic variants of the receptor can lead to a decrease in its expression. The presence of these variants can mean people are more susceptible to renal inflammation, fibrosis and ultimately renal injury, as Dr Ines Armando of George Washington University explains

Understanding the kidney’s protector

A chemical naturally produced in the body, dopamine plays a number of important roles in the brain and body, including functioning as a neurotransmitter. Based at the George Washington University School of Medicine, Dr Ines Armando is the coordinator of a research project investigating the role of the dopamine D2 receptor in renal inflammation and injury. “This project is related to the protective effects of one of the dopamine receptors in the kidney, the D2 receptor,” she outlines. The renal dopamine receptors are closely involved in natriuresis, the process by which sodium is excreted in the urine, and diuresis, the increased production of urine, both of which have a significant impact on individual health. “Scientists have discovered that there is a system in several tissues called the peripheral dopaminergic system that, among other things, is able to produce dopamine from the precursor, L-dopa,” continues Dr Armando. “The kidney itself has a very active renal dopaminergic system, which is involved in the control of salt and water excretion, which in turn has an impact on blood pressure.”

Oxidative stress

The renal dopaminergic system is also involved in other processes, which are the focus of continued research. Some of its receptors are known to regulate the production of reactive oxygen species, which can lead to oxidative stress in the kidney; Dr Armando is now building on these findings to look at the regulation of inflammation. “Many diseases have an inflammatory component, or are related to native immunity. There are several models with which you can prove that inflammation can lead to the development of hypertension, for example,” she says. The relationship is complex, with hypertension itself also leading to inflammation; researchers have found that the D2 receptor has a protective effect against inflammation, and also that people with a lower expression of it are more vulnerable to inflammation. “Genetic variants of the receptor can lead to a decrease in the expression of the D2 receptor. Three particular variants of the D2 receptor are related to a decrease in the expression of the receptor,” outlines Dr Armando. “Cells from the subjects with these variants have fewer receptors than those from people who don’t have these variants.”

These cells are a lot more prone to developing inflammation, fibrosis, and ultimately renal injury, underlining the wider importance of Dr Armando’s research. These dopamine D2 receptors are located in several parts of a nephron – the fundamental unit of the kidney – but Dr Armando and her colleagues are focusing their research on the renal proximal tubule. “Cells in the proximal tubule play a central role in the control of water and salt excretion,” she explains. Researchers aim to establish whether deficient function of the dopamine D2 receptor enhances the inflammatory reaction, and whether it is a key factor in renal inflammation and injury; this work could be relevant to treatment. “If we can find the mechanisms behind renal inflammation, we can target those mechanisms instead of targeting the receptor. Increasing the amount of receptors may not be that easy. But targeting what is generating this inflammation, or fibrosis, is maybe more realistic,” outlines Dr Armando. “We have found that several systems are involved in determining what these receptors produce.”

A number of papers have been published in this area recently, including one on how the D2 receptors regulate the expression of a particular micro-RNA that eventually regulates fibrosis, and research is ongoing into other systems. The project is pursuing research on the regulation of apoptosis – cell death – with respect to the D2 receptors, as well as into specific signal transduction pathways. “Several signal transduction pathways are altered by the absence of these D2 receptors, and they regulate the expression of inflammatory factors,” says Dr Armando. Researchers are using animal models to investigate the role of the D2 receptor in renal injury and inflammation. “We deleted the receptor in mice and they developed kidney injury and hypertension. Then we re-introduced the receptors using a virus that carried a plasmid containing the D2 receptor,” continues Dr Armando. “When we replaced the receptor, we found that the renal injury became a lot less severe. Some of the effects of renal injury cannot be reversed. However, what reintroducing the receptor did achieve was to stop the damage progressing and prevent the injury getting worse.”



This approach could hold potential as a means of treating kidney disease in human patients, yet many technical challenges remain before it can be widely applied, and researchers continue to investigate other possible methods of treatment. Dr Armando says another area being investigated is the development of drugs that don’t cross the blood-brain barrier. “This is because D2 antagonists – drugs that affect the D2 receptor – are currently used as anti-psychotics. They can cause unwanted side-effects,” she outlines. Dopamine agonists are not heavily used at the moment, except for a few used to treat increased levels of prolactin, a hormone that is regulated by dopamine; Dr Armando believes they could eventually be used to treat chronic kidney disease. “If we can develop dopamine agonists that target just the kidney, do not cross the blood-brain barrier and that can be put in nanoparticles – or in some other kind of molecule – then they could potentially be used to treat chronic kidney disease. It may not completely reverse it – there are very few things you can do once you have it – but it could prevent it from progressing further,” she says.

Kidney transplant

The main goal in managing chronic kidney disease is halting its progression, partly through controlling any existing conditions, such as diabetes and high blood pressure. However, in advanced cases of chronic kidney disease, a transplant may be required; this is an area Dr Armando would like to investigate further. “We could look at people who have had kidney transplants, and whether the transplanted kidney has these variants that decrease the number of receptors. If an individual has a tendency to inflammation and fibrosis, it is possible that their transplanted kidney won’t last that long,” she outlines. There is a long waiting list for kidney transplants, and it’s not generally possible to wait for the ideal organ; however, Dr Armando believes it is possible to match a donated kidney more closely to the physiology of the patient. “If we know whether these variants are present before the transplant, and we are able to develop these other drugs, then that may help the transplanted kidney to survive, yet there’s a lot of work to do on this. We have to find a population, talk to the people, convince them that it’s important, and get the funding to do it,” she outlines.

Proposed effects of D2R-mediated prevention/amelioration of renal inflammation and end organ damage. Several adverse stimuli (e.g. hypertension, a high NaCl diet) activate renal cells to produce cytokines and chemokines that have potent inflammatory effects and induce the recruitment of inflammatory cells, resulting in tissue inflammation, fibrosis and end organ damage (red arrows). The D2R negatively regulates the production of cytokines and chemokines, decreasing the infiltration of inflammatory cells and thus promoting a normal kidney (green arrows).


At a glance

Full Project Title
Role of Dopamine D2 Receptor in Renal Inflammation and Injury (The Role of Dopamine Receptors)

Project Objectives
To study factors that prevent the development of renal inflammation and hypertension and slow the progression of kidney disease.

Project Funding
The Role of Dopamine Receptors project is funded by the National Institutes of Health (NIH), USA.

Contact Details
Project Coordinator, Dr Ines Armando PhD
Associate Professor of Medicine
The George Washington University School of Medicine and Health Sciences, Washington, DC, USA
T: +1 (410) 706 6013
E: iarmando@gwu.edu

Dr Ines Armando PhD

Project Coordinator

Dr Armando received her PhD in Biochemistry from the University of Buenos Aires, Argentina, and postdoctoral training at the University of London, UK, and the National Institutes of Health, USA. Dr Armando has considerable expertise in the physiology, pharmacology and biochemistry of the stress response, and in central and peripheral mechanisms of regulation of blood pressure and renal function, in particular the role of the renal dopaminergic system, which has been her primary research interest for several years.

Dr Armando has authored more than 130 original papers in high-quality journals and around 15 reviews and book chapters. Her research and contributions have been recognized by a number of awards, including several national and international grants, and by memberships of the advisory boards and organizing committees of several international scientific meetings.



Researchers continue to investigate new gene therapy methods, aiming to develop more effective means of correcting genes as a way of treating disease. Dr Fulvio Mavilio, coordinator of the GT-SKIN project, tells us about their work in developing a new gene targeting and gene correction technology, which could be used in the future to treat genetic diseases

A new age in gene therapy
Recent years have seen a renewal of interest in gene therapy, with researchers investigating new methods of delivering therapeutic genes directly into patients' cells as a means of treating disease. Based at Genethon, a non-profit research organization near Paris, Dr Fulvio Mavilio is the coordinator of the GT-SKIN project, an ERC-funded initiative that aims to develop a new gene targeting and correction technology. "The objective of the project is to develop a technology for direct editing of the human genome. Gene therapy today is based on what we call gene replacement technology. This means that we just add a normal copy of a functioning gene into cells," he explains. Many genetic diseases are caused by the failure of a specific gene to function properly, particularly in the skin and the blood systems, the two systems being addressed in GT-SKIN. "We are now developing technology based on so-called 'designer nucleases', essentially enzymes that can introduce corrections into the genome. With this technology we aim to correct mutations directly in the affected genes," continues Dr Mavilio. "The most promising technology available – or at least the most flexible and the most adaptable to many different cell systems and genes – is called CRISPR/Cas9."

CRISPR/Cas9
Cas9 is essentially a bacteria-derived nuclease that is guided to its target in the human genome by small sequences of RNA that can be easily synthesised in the laboratory. With the CRISPR/Cas9 technology, researchers can cut the genome at any location; the technology has so far mainly been used for research purposes, but now Dr Mavilio aims to widen its therapeutic potential. "What we are trying to do with this project is to adapt this system and develop it into something we can use in patients," he explains. Researchers are


targeting two types of cells, blood stem cells and skin stem cells, both of which can be obtained from patients, genetically modified and re-transplanted with relative ease. “We target the stem cells of the skin to correct genetic defects that cause a family of diseases called epidermolysis bullosa. Patients affected by these diseases develop severe blisters – the skin cannot stay attached to the body because it lacks the adhesion complex necessary to keep it firmly attached to the dermis,” continues Dr Mavilio. “The disease develops if there are mutations that affect this skin adhesion system. Patients become sensitive to any sort of infection, have a very low quality of life, and often develop deadly skin cancer. Currently there is no cure whatsoever.”
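Since the commonly used form of Cas9 cuts only where the guide's 20-letter DNA match sits next to a short 'NGG' motif known as the PAM – a general property of the tool, not a detail reported by the project – finding candidate target sites in a gene amounts to a simple scan. A minimal, hypothetical sketch in Python:

```python
import re

def find_cas9_targets(dna: str, guide_len: int = 20):
    """Scan a DNA sequence for candidate Cas9 target sites:
    a protospacer of `guide_len` bases immediately followed by
    an NGG PAM (the motif required by the commonly used Cas9)."""
    dna = dna.upper()
    targets = []
    # Lookahead regex finds every (possibly overlapping) NGG motif.
    for m in re.finditer(r"(?=([ACGT]GG))", dna):
        pam_start = m.start(1)
        if pam_start >= guide_len:
            protospacer = dna[pam_start - guide_len:pam_start]
            targets.append((pam_start - guide_len, protospacer, m.group(1)))
    return targets

# Toy example: a short made-up sequence, with a short guide to match.
seq = "ATGCTGACCTTGAGGCTAGCTAGGATCCGGATTACAGGGTTTACGG"
for pos, spacer, pam in find_cas9_targets(seq, guide_len=10):
    print(f"position {pos}: protospacer {spacer}, PAM {pam}")
```

Real guide design also screens candidates against the rest of the genome to avoid off-target cuts; that screening, rather than the scan itself, is where most of the effort lies.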

The objective of the project is to develop a technology for doing direct gene editing instead of gene replacement, in which we add a normal copy of a functioning gene into cells by complex vector-based technology

A classical gene therapy approach was previously used to treat the disease, in which a normal copy of the mutated gene was introduced by a vector into the patient's skin stem cells. The corrected stem cells were grown in the laboratory to generate a piece of epidermis that could then be grafted on to the patient. While this worked in a pilot clinical trial, Dr Mavilio says there are practical problems with this approach. "It's a complex process – the genes that are affected in these diseases are very large genes. So they are difficult to transfer, and the viral vectors we used may potentially cause undesired side effects," he outlines. Researchers hope CRISPR/Cas9 will prove to be a safer approach. "We aim to correct the cells by genome editing instead of gene replacement, and then derive skin grafts and transplant them as we did in our previous approach, but without the problems typically associated with viral vectors – in particular their complexity, manufacturing cost, potential toxicity and limitations in the size of the genes they can transfer into stem cells," says Dr Mavilio. "Gene editing may become a simpler, more economical and safer form of gene therapy."

Blood stem cells
The second application of the project's research is blood stem cells, which, like skin stem cells, can be obtained from patients relatively easily. The goal is again to correct genetic defects in these cells, which cause immune deficiencies or hemoglobinopathies like thalassemia and sickle-cell anaemia. "These are important, relatively common genetic diseases," stresses Dr Mavilio. While the research focus is on blood and skin stem cells, Dr Mavilio says this approach could be applied to other genetic conditions. "Gene editing could be used in any genetic condition, provided that we have the tools to deliver the Cas9 nuclease and the guide RNA – the complex of molecules that can do genome editing – into the relevant cells," he outlines. "The reason we target stem cells is because we can take them out, manipulate and re-transplant them by safe procedures that have been used by clinicians for decades. Stem cells find their way to re-constitute the tissues – skin in one case and blood in the other. Unfortunately, there are limitations to the use of this approach: in the case of



neuromuscular or central nervous system diseases, for example, there is no stem cell that we can correct and re-transplant – this is another level of complexity." There are groups at Genethon working on these diseases, yet the focus for the GT-SKIN project remains on blood and skin stem cells, and on demonstrating the efficacy of its approach. Gene-corrected skin or blood stem cells will be tested in pre-clinical models, where they are transplanted into immuno-deficient mice. "The reason for using immuno-deficient mice is that they cannot reject the transplant, as their immune system is impaired. We can grow little pieces of skin on the back of these mice, and prove that the skin stays attached and doesn't produce any blisters," explains Dr Mavilio. The same approach is taken with the blood stem cells. "Again, we transplant the cells into immuno-deficient mice, in which blood eventually comes in part from mouse stem cells and in part from the human stem cells that we transplant. So we have a sort of chimera in which we can study the properties of tissues derived from the transplanted stem cells," continues Dr Mavilio. "We can test in a living animal whether the gene correction we have introduced using the CRISPR/Cas9 system restores normal function in skin or blood stem cells."

This research not only holds enormous relevance to the project's objectives, but will also create a wider knowledge base on the molecular aspects of epidermal and hematopoietic stem cell biology. While the potential of these cells for gene therapy is widely recognised, some important questions remain before it can be more widely applied. "We don't know enough about skin and blood stem cells – what is their genetic programme? What genes do they express? What combination of genes gives these cells their property of being a stem cell?" asks Dr Mavilio. Researchers aim to complete the proof-of-principle study over the next few years, and then look towards the wider potential of their work. "We will use the ERC funding to complete the risky phase – to prove the principle, the idea – and then start working with industrial partners to translate the technology into clinical applications," explains Dr Mavilio. "Industry tends to come in when an approach has been proved to be feasible – the big pharmaceutical companies are not particularly interested in paying for the very risky initial phases of developing a new technology. This is particularly true in the field of rare diseases, where the prospective market is by nature very small."

At a glance
Full Project Title
Gene Therapy for Inherited Skin Adhesion Disorders (GT-SKIN)
Project Objectives
The objective of the GT-SKIN project is the development and pre-clinical evaluation of genome editing as the next-generation gene therapy technology. The anticipated output is the development of next-generation technology for safer and more efficacious gene therapy, and the establishment of a knowledge base for better utilization of stem cells.
Project Funding
European Research Council
Project Partners
• CRISPR Therapeutics, Stevenage, UK
Contact Details
Fulvio Mavilio, PhD
Scientific Director, Genethon
1bis rue de l'Internationale, 91002 Evry, France
T: +33-1-69472978
E: fmavilio@genethon.fr
W: www.genethon.fr

Fulvio Mavilio, Ph.D

Human skin derived from stem cells genetically modified to express a fluorescent (green) transgene in the basal layer, grafted onto an immunodeficient mouse. Nuclei are stained in blue.

Fulvio Mavilio, Ph.D., is Scientific Director of Genethon (Evry, France) and Professor of Molecular Biology at the University of Modena and Reggio Emilia (Modena, Italy). He was Director of Discovery of Molmed S.p.A. (2002-2005) and founder and Chief Scientific Officer of Genera S.p.A. (1999-2002). He had previously served as co-Director of the San Raffaele-Telethon Institute of Gene Therapy in Milan (1995 to 2002), as director of the Molecular Hematology unit of the San Raffaele Institute (1989 to 1995), as Visiting Scientist at the Wistar Institute, Philadelphia (1986 to 1988), and as group leader in the Department of Hematology-Oncology of the Istituto Superiore di Sanità in Rome (1984 to 1988). Prof. Mavilio graduated in Biology at the University of Rome in 1976, and obtained a Ph.D. in Medical Genetics at the same University in 1979.



Hearts and Minds
The various cardiovascular diseases (CVD) together constitute one of the world's biggest healthcare challenges, and many European research efforts are focused on developing methods for patient-specific diagnosis and treatment. From computer simulations to growing artificial heart muscles, there are some ground-breaking projects underway that intend to refine our ability to understand and manage the diseases which affect so many – but in a way that is personalised to each patient. By Richard Forsyth

Cardiovascular disease (CVD) represents a collection of heart and blood vessel conditions that lead to heart attacks and strokes. First, let's take a look at the dreadful statistics in Europe. Over 4 million people die from CVD in Europe each year, which represents 45% (almost half) of all European deaths. CVD is the main cause of death in women in all the countries of Europe, and the main cause of death in men in all but six. It also leads to terrible, dramatically life-altering disabilities. Economically, the ripple effect from CVD is equally devastating. Combining healthcare costs, productivity losses and informal care, CVD is responsible for more than 196 billion euros lost annually across Europe. Despite the scale of the problem, research has lessened the impact of these diseases in recent decades, and new projects can potentially create further methods for preventing and managing CVD. This is widely recognised, and is the reason that continued and significant investment in research around CVD is seen by European authorities as justifiable. It's no secret that lifestyle can affect the risk of CVD, so smokers, heavy drinkers and those consuming what is perceived as a bad diet with no exercise all fall into a risk zone. There are also underlying genetic factors at play in susceptibility to disease, and so great efforts have been and are being made to discover new ways to diagnose and treat people who may develop, or who have developed, CVD. A major challenge in this field is poor prognosis. Another is that response to treatment is not one-size-fits-all: treatment is not homogeneous, and so each patient needs to be looked at in isolation for a specific, tailored diagnosis and treatment plan. It is recognised that successfully diagnosing, treating and caring for patients with CVD will require a way of assessing an individual's heart at a genetic level and revealing flaws in the cardiovascular system.

INDIVIDUHEART's proposed innovative analysis will rely on the development of a personalised three-dimensional model of an individual's heart, which will allow for the detection of recognised variants that can cause problems. Skin cells are taken from patients who have hereditary heart disease. These cells are then transformed into stem cells, from which three-dimensional, steadily beating heart muscles are grown. These heart muscles will then be compared to artificially grown heart muscles from healthy people.

The Personal Touch
With around 22,000 genes playing a part in the makeup of the human heart, there are certain mixtures of these genes that determine risk of disease. To really understand someone's likelihood of developing heart disease, it is therefore imperative that their personal gene signatures are studied. That is why projects like INDIVIDUHEART, an ERC-funded project pioneered by Prof Thomas Eschenhagen, are setting out to devise an analytical method to assess an individual's personal chances of suffering heart failure.



Skin cells are taken from patients who have hereditary heart disease. These cells are then transformed into stem cells, from which three-dimensional, steadily beating heart muscles are grown. These heart muscles will then be compared to artificially grown heart muscles from healthy people

This will allow for the detection of defects, and also for the testing of drugs and new therapies. Prof Eschenhagen is convinced that detecting small differences between sick and healthy heart muscles will give physicians a way to make treatment decisions tailored specifically to an individual patient.

Big Computers For Big Challenges
The human heart is basically a pump, but the way it functions still intrigues scientists – it is the organ that acts as the engine room of our entire cardiovascular system, and understanding its biological complexities is a priority for healthcare knowledge. In recent years, several projects around the world have created accurate heart simulations. One heart modelling initiative, the MyHeart project, an FP6 project that ended in 2008, helped drive the development of a new generation of computers, as the project became the target application for a new supercomputer being built in Japan. The challenge required petascale computing power, which shows the level of resources needed for accurate heart function analysis. It is no mundane challenge to model this organ, so central to human physiology. In 2012, the world's most powerful computer at the time – Sequoia, a US defence lab computer built to model nuclear explosions – was tasked by scientists with building a simulation of the heart down to the cellular level, to see how it responded to certain drugs. A research team at Lawrence Livermore National Laboratory modelled the electrical signals travelling from cell to cell that cause contraction. Amazingly, this state-of-the-art digital simulation could simulate an hour of heart activity in seven hours of computing; prior to this, such models took 45 minutes to simulate a single beat. The implications are great – drugs and pacemakers could be tested in simulation before being tested on human beings, for instance. Take also the Alya Red project, which came to fruition in 2013. Technologists at the Barcelona Supercomputing Centre worked closely with doctors and experts in clinical imagery to perfect the software. The research team relied on the phenomenal computing power of MareNostrum – a supercomputer that ran at one petaflop across 50,000 Intel processors. Initially, the imagery and data needed were obtained from an MRI scanner to map the human heart into a 3D model. The computer simulation made it possible to study how muscular fibres play a role in heart function.
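To give a sense of what such cell-level simulations integrate, the sketch below steps through the FitzHugh-Nagumo equations, a classic two-variable caricature of an excitable cell's electrical cycle. It is a toy illustration with standard textbook parameters – nothing like the petascale, anatomically detailed models described above:

```python
import numpy as np

def fitzhugh_nagumo(T=500.0, dt=0.1, I_ext=0.5):
    """Integrate the FitzHugh-Nagumo model with forward Euler.
    v: fast membrane potential; w: slow recovery variable."""
    a, b, tau = 0.7, 0.8, 12.5          # standard textbook parameters
    n = int(T / dt)
    v, w = -1.0, 1.0                    # resting-ish initial state
    trace = np.empty(n)
    for i in range(n):
        dv = v - v**3 / 3 - w + I_ext   # cubic nonlinearity drives the spike
        dw = (v + a - b * w) / tau      # slow recovery pulls v back down
        v += dt * dv
        w += dt * dw
        trace[i] = v
    return trace

# With constant stimulation the model fires periodic action potentials -
# the basic electrical event that sweeps across heart tissue cell to cell.
print(fitzhugh_nagumo()[::500][:10])
```

Scaling this idea up – billions of coupled cells, realistic ion-channel models and real tissue geometry – is what consumed Sequoia's and MareNostrum's petaflops.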

Usable Technology
Similarly, INTEG-CV-SIM, an FP7 project currently running, seeks to exploit the power of computer modelling for patient-specific techniques. The project's broader explanatory title is: An Integrated Computer Modelling Framework for Subject-Specific Cardiovascular Simulation: Applications to Disease Research, Treatment Planning and Medical Device Design.



In 2012 – the world's most powerful computer at the time, Sequoia, a US defence lab computer that was built to model nuclear explosions – was tasked by scientists to build a simulation of the heart down to the cellular level to see how it responded to certain drugs

Clinical diagnosis currently relies largely on invasive techniques of sampling or imaging; this is true of disease research too. Additionally, medical device manufacturers rely on in-vitro models to work out the anatomical variations for the design of stents and stent-grafts. INTEG-CV-SIM seeks to create a computer modelling simulation which can be adapted to each patient's personal criteria, with a wide variation of boundary conditions that could represent alterations in the physical state of the patient. Set in the clinical environment, this computer modelling would complement traditional methods of patient diagnosis. The truth is, whilst patient-specific modelling in the clinical environment shows great promise, it is currently limited in practice. Whilst it is already possible to create incredible heart simulations, the successes have historically taken extraordinary computing power and combinations of imaging, IT and clinical expertise working harmoniously in sync. These different fields of expertise often have different work cultures, and all are highly technical, relying on specific training to be understood. Getting disciplines such as IT and medicine aligned is therefore a challenge in itself – especially when the technology needs to be accessible in a clinician's workspace.


The Whole Human
Of course, if we are going to create patient-specific modelled simulations of organs, then why not go a step further and model the whole human anatomy – so we truly understand the cause and effect processes throughout the human body? The European Commission has great ambitions here, with substantial funding invested in projects for what it calls the Virtual Physiological Human (VPH), in which complete graphical models of human beings are created – from molecules to tissue layers to organs, to the way those organs interact in different situations. The idea is similar to some of the projects already discussed, in that a person's personal information can be fed into the model to see how it reacts to treatments. You should be able to feed information from various sources, such as CAT scans, X-rays and ultrasound, into the VPH. The VPH approach seeks to treat the human body as a multi-organ system, as opposed to a collection of organs, each in isolation. The work on VPH demonstrates a powerful tool that could really transform the way we detect health problems and experiment with treatment pathways.

Huge Benefits
The overriding aim is that Europe will have a greatly improved healthcare system as a result of the research in these areas. We could all benefit from personalised care solutions which would be more



accurate and holistic. Simulation of patient-specific physiology presents an opportunity to alleviate suffering, speed up treatment and truly understand the internal processes of the human body. Using comprehensive biomedical data from a patient to simulate potential treatments and outcomes would mean the patient may be able to avoid unnecessary treatments that would prove ineffective. As an aside, this presents issues in the collection, storage and use of patient data which will need careful management, but the gains of this data-centric approach are easy to understand and appreciate in this field. The personalised approach means that we would be safer and drugs would be more effective. The projects would also enable an attractive preventative approach to disease. In addition, computer simulation of drug testing could reduce the need to experiment on animals in clinical trials, an area of research that often sparks controversy in the public eye; many people campaign vigorously for improved animal rights in this context. Further to this, through patient-specific simulations we hope to understand the interconnectedness of our anatomy – for instance, how the heart affects the rest of the body. Looking to the future, the numerous research projects in patient-specific modelling should help to bring the frequency of strokes and deaths down from their alarming present numbers. There is still a way to go. We will need to develop much in the way of procedures, interfaces and software for computers that feed off patient data if we are to truly create personal and effective treatments. One thing that is certain: the less we leave to chance in treatment choices, the more we plan for success in our continued battle against Europe's biggest killer, CVD.


The Fascinating Nature Of The Heart

The heart is the centre of the cardiovascular system. With its four valves thumping, the heart beats around 100,000 times a day and during an average lifetime it pumps about 1 million barrels of blood. Its job is to get blood to nearly all of the body’s 75 trillion cells. The power output of the heart ranges from 1-5 watts. The heart has its own electrical impulse which means as long as it has a good supply of oxygen it can continue to beat outside of the body.



Extending the frontiers of computer vision learning

The traditional approach in the computer vision field has been to learn about individual concepts in isolation, after which the computer starts again from scratch. The VISCUL project is developing a new approach where a computer uses its pre-existing knowledge when learning about new concepts, as Principal Investigator Dr Vittorio Ferrari explains

The human sense of vision is highly advanced, allowing us to rapidly identify objects and build our knowledge of what they represent. The traditional approach in the computer vision field has been to learn about individual concepts in isolation, after which the learning process starts again, but now researchers in the VISCUL project are developing a new approach to visual learning in computers. "We're developing a new approach in the project, which is about computers trying to learn new concepts based on everything they have learned before. So if the computer already knows about bicycles and horses, maybe that will help it learn about cars, for example," explains Dr Vittorio Ferrari, the project's principal investigator. This knowledge base continually grows, acting as a foundation for further learning, helping the computer to identify and


interpret images in a scene. “The knowledge base helps the computer learn new things. So in a way it’s about learning to learn,” continues Dr Ferrari. “The computer can not only interpret a new scene, it can also learn new elements that are in that scene, thanks to the elements it already knows.”

Full supervision
This point is well illustrated by the similarities between cars and trains. If a computer has not seen a train before, but knows what cars look like, then Dr Ferrari says it can use this pre-existing knowledge. "Cars and trains have wheels in common, and they have a certain general appearance of artificial objects. They have a regular shape, a certain metallic appearance, so the computer can transfer these elements when it tries to learn about trains," he outlines. This base of knowledge helps the computer

to learn with less supervision and guidance than is currently required. "Technically speaking, the typical level of supervision – normally called full supervision – corresponds to the degree of supervision that matches the level of output you want on a new image," says Dr Ferrari. "For instance, if you want the computer to learn a technique that is able to localise cars, by putting a rectangle around the car, then you should first train the computer on other cars that also have a rectangle around them, made by a human." The computer can learn about the car from these annotations, which will improve its ability to interpret images containing cars. While full supervision is a well-established method in computer vision, weak supervision is a major area of interest for Dr Ferrari and his colleagues. "There is definitely still a level of supervision in our approach, I don't believe in unsupervised learning," he stresses. Weak supervision might mean that a car is no longer framed within an image by a rectangle, but there is still some level of guidance to help the computer identify it. "You might have something weaker, such as the knowledge that the car is in the image somewhere," says Dr Ferrari. "This is more than no supervision. No supervision would mean you just have a lot of images, some of which have the car and some of them don't, and the computer just tries to learn from that. That is very difficult. So if you at least know which images have a car in them, but you don't know where they are, then that makes it easier. That's what we typically call weak supervision."

Illustration of the project's scheme for knowledge transfer on ImageNet. First, E-SVM classifiers are trained for each object exemplar in the source set. Next, these classifiers are used to represent all windows in both source and target sets, which are then embedded into a low-dimensional space (coined Associative Embedding, AE). After adding location and scale cues to the AE representation of windows, Gaussian Process regression is used to transfer annotations about overlap with the object from source windows to target ones. The final score assigned to a target window integrates the mean predicted overlap and the uncertainty of the prediction.

This research builds on the WordNet database, a lexical database that organises all the words in the English language into a semantic hierarchy. It functions almost as a tree, with sub-branches for related words and concepts. "So you might have, in a series: 'a car, which is a transportation device, which is an artefact'. In another branch you might have animals at the top node, then mammals, followed by quadruped, going right down to a particular breed of dog," outlines Dr Ferrari. This has been extended by computer vision researchers into ImageNet, essentially the WordNet hierarchy with images attached to each node. "We use this hierarchy as a map to know where to transfer knowledge. So, typically, the images in the ImageNet database contain some annotations. The database contains some images that have detailed annotations, such as a rectangle around each object. And it contains other images that don't," says Dr Ferrari. "This is a perfect playground for my project – we use these visual concepts, which are normally organised into object classes."
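The 'hierarchy as a map' idea can be made concrete with a toy example. The sketch below picks, for an unannotated class, the semantically closest annotated class as its source of transferred knowledge. The class names are invented, and this illustrates tree-guided transfer only – not the project's actual Associative Embedding pipeline described in the figure caption above:

```python
# Toy semantic hierarchy (child -> parent), loosely WordNet-style.
parent = {
    "labrador": "dog", "beagle": "dog", "dog": "mammal",
    "horse": "mammal", "mammal": "animal", "animal": "entity",
    "car": "vehicle", "train": "vehicle",
    "vehicle": "artefact", "artefact": "entity",
}

def path_to_root(node):
    """Return the chain from a node up to the root of the hierarchy."""
    path = [node]
    while path[-1] in parent:
        path.append(parent[path[-1]])
    return path

def semantic_distance(a, b):
    """Tree distance: hops from a and b to their lowest common ancestor."""
    pa, pb = path_to_root(a), path_to_root(b)
    common = next(n for n in pa if n in pb)
    return pa.index(common) + pb.index(common)

def transfer_source(target, annotated):
    """Choose the semantically closest annotated class as the source
    of transferred knowledge for an un-annotated target class."""
    return min(annotated, key=lambda c: semantic_distance(target, c))

annotated = ["labrador", "car", "horse"]      # classes with bounding boxes
print(transfer_source("beagle", annotated))   # -> labrador (sibling breed)
print(transfer_source("train", annotated))    # -> car, not a dog
```

As the printed output shows, a new dog breed borrows from another dog breed, while a train borrows from a car – never across unrelated branches.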

Researchers are using those classes with annotations to build models which then transfer knowledge to the classes which aren't supervised. This transfer is guided by semantic relations. "So we are going to transfer knowledge from a breed of dog to another breed of dog, not for example from a certain dog to a car. We're also able to transfer information across semantically related concepts – this works really well," explains Dr Ferrari. The effects of transferring knowledge and incremental learning don't really appear until the total number of images runs into the hundreds of thousands, though, or even millions; Dr Ferrari says that extensive computational resources are required to pursue research on this scale. "In order to work at that scale, we need a good infrastructure. These days you need a mixture of central processing units (CPUs) and graphics processing units (GPUs) – research has moved towards GPUs due to the advent of neural networks for computer vision, which are really very important in our research," he outlines. This is something of an accident of history, as GPUs were originally developed for computer graphics before their potential in machine learning became apparent. Dr Ferrari and his colleagues are using GPUs to train and develop machine learning methods that are built on convolutions, a type of linear filter which helps represent a feature from the original image. "Convolutions are the basis of convolutional neural networks," he says. These convolutional neural networks are a type of artificial neural network that requires the computation of millions of multiplications on every image, every second, which is why GPUs are very effective in this area. "There are a series of very small processors inside the GPU – typically there are 200-300 cores. They do certain things very well – in particular GPUs are very good at matrix multiplication, the fundamental operation that GPUs do super-fast. It is common to both convolutions – which are necessary for neural networks – and projections, which are required for graphics."
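Dr Ferrari's point about matrix multiplication can be seen in a few lines: a convolution becomes a single matrix product once every image patch is unrolled into a row (the so-called im2col trick). A minimal NumPy sketch, not code from the project:

```python
import numpy as np

def conv2d_via_matmul(image, kernel):
    """2D 'valid' convolution (cross-correlation, as CNNs compute it)
    expressed as one matrix multiplication via the im2col trick -
    exactly the operation GPUs execute extremely fast."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    # Unroll every kh x kw patch of the image into one row of a matrix.
    patches = np.empty((oh * ow, kh * kw))
    for i in range(oh):
        for j in range(ow):
            patches[i * ow + j] = image[i:i + kh, j:j + kw].ravel()
    # One matrix-vector product computes every filter response at once.
    return (patches @ kernel.ravel()).reshape(oh, ow)

image = np.arange(25, dtype=float).reshape(5, 5)
edge_filter = np.array([[1.0, 0.0, -1.0]] * 3)  # simple vertical-edge detector
print(conv2d_via_matmul(image, edge_filter))
```

Deep-learning libraries batch thousands of such patch matrices together, which is why hardware built for fast matrix products suits them so well.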

Supervisory signals
The project is also investigating the area between full and weak supervision. Researchers are investigating the use of eye-tracking data as a supervisory signal, to help a computer learn about a particular class of image. "If you ask an individual to look at a horse, normally they look at it and then move on to the next image. From the pattern of their eye movement you can roughly find out where the horse is in the image," explains Dr Ferrari. This can then be used as a supervisory signal for the computer to learn about horses, as that is the focus of the eye movement, helping it differentiate the horse from other elements of the image. "The real problem in visual learning is that it's not clear to the computer why those pixels that are not on the target object aren't on it. So if you just show horse images to a computer, it might learn that a grass field is a horse for example," continues Dr Ferrari. "So a big fight in visual learning, when it's weakly supervised, is to try to figure out where the object is before you even know how it looks." This is something of a chicken-and-egg type problem. Introducing eye-tracking data is a way of solving the issue, giving the computer the minimum amount of guidance required to achieve the desired result. "From the eye tracker you know there was a horse somewhere along this scan path. So it's all about having this fine level of guidance," outlines Dr Ferrari.

If you ask an individual to look at a horse, normally they look at it and then move on to the next image. From the pattern of their eye movement you can roughly find out where the horse is in the image

The eventual goal is to establish an automatic lifelong learning loop, requiring less supervision. "We will try to pull together all the images we have into a single framework, and try to get the machine to go into an automatic lifelong learning loop. This means training the computer and introducing a lot of data with mixed supervision. Some images will have a rectangle around the object, some a label, some will have eye-tracking data," says Dr Ferrari. "The computer can choose to learn from the ten easiest clusters of information, which maybe have more data, or more supervision. The computer learns from that and automatically chooses to go to the next, most beneficial object class." Dr Ferrari and his team have successfully transferred knowledge across the ImageNet hierarchy, from high supervision down to lower supervision, while there have also been positive results on the use of eye-tracking data as a supervisory signal. Dr Ferrari plans to pursue further research in this area in future, including in the active learning field, where the computer requests assistance from a human when it has trouble localizing an object in an image. "I think the future is where a computer learns for a while, and produces some annotations of its own. It generates its own annotations during weakly supervised learning," he says. Many of these annotations may be wrong, but if the computer can ask for verification from a human, then it can correct itself and improve its internal models. "So the computer can ask a human: 'Look at these 100 images of a horse that I think I localised correctly – are they really where I think they are?'" explains Dr Ferrari. "And the human can say: 'well, actually here you were wrong.' That's actually a very strong supervisory signal, according to the latest research." This is a much more efficient approach than asking a general question at the beginning, before the learning loop starts. There's less need for human intervention and the machine asks questions itself; this also has significant cost implications. "It takes about 2-3 minutes for a person to draw a precise outline around a horse. You can imagine that doing that on a million images will be very expensive," points out Dr Ferrari. By contrast, the computer could get a strong supervisory signal by asking a person for verification, which takes only a few seconds. "This enables the computer to learn with much less human effort. In the end, that's the ultimate goal of all weakly supervised learning projects – to try to get the computer to do most of the job itself, and to minimise the need for human intervention," says Dr Ferrari.
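As a concrete, if simplistic, illustration of turning a scan path into localisation guidance, the sketch below converts fixation points into a rough bounding box. The fixation data are invented, and the project's actual approach (the ECCV 2014 paper listed in the panel below) trains detectors from eye-tracking data rather than applying a fixed rule like this:

```python
import numpy as np

def box_from_fixations(fixations, trim=10, pad=0.1):
    """Turn eye-tracking fixation points into a rough bounding box.
    Trims outlying fixations via percentiles and pads the result,
    since viewers also glance at the background. A heuristic sketch
    only, not a trained model."""
    pts = np.asarray(fixations, dtype=float)
    lo = np.percentile(pts, trim, axis=0)        # robust min corner
    hi = np.percentile(pts, 100 - trim, axis=0)  # robust max corner
    span = hi - lo
    return (lo - pad * span).tolist(), (hi + pad * span).tolist()

# Hypothetical fixations (x, y) recorded while a viewer looked at a horse;
# one stray glance at the background is largely ignored by the trimming.
fixations = [(210, 140), (230, 150), (225, 160), (400, 60), (215, 155)]
top_left, bottom_right = box_from_fixations(fixations)
print(top_left, bottom_right)
```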

At a glance
Full Project Title
Visual Culture for Image Understanding (VisCul)
Project Objectives
The VisCul project is about lifelong learning in Computer Vision. Traditionally, Computer Vision methods learn each visual concept separately, starting from scratch every time. The objective of the project is to break from this tradition by developing methods to continuously learn visual concepts on top of everything that was learned before. This will bring computers closer to how humans learn.
Project Funding
1.48M Euro
Contact Details
Principal Investigator, Dr Vittorio Ferrari
IPAB, School of Informatics
Crichton Street 10, Edinburgh EH8 9AB, UK
T: +44 131 650 2697
E: vferrari@staffmail.ed.ac.uk
W: http://calvin.inf.ed.ac.uk
Selected Publications
A. Vezhnevets and V. Ferrari, Associative embeddings for large-scale knowledge transfer with self-assessment, IEEE Computer Vision and Pattern Recognition (CVPR), Columbus, June 2014.
D. P. Papadopoulos, A. D. F. Clarke, F. Keller and V. Ferrari, Training object class detectors from eye tracking data, European Conference on Computer Vision (ECCV), Zurich, Switzerland, September 2014.

Dr Vittorio Ferrari

Principal Investigator

Vittorio Ferrari is a Reader at the University of Edinburgh. He received his PhD from ETHZ in 2004. After post-docs at INRIA Grenoble and the University of Oxford, he returned to ETHZ as Assistant Professor in 2008-2012. He received several awards including an SNSF Professorship, the ERC Starting Grant, and the best paper award from the European Conference in Computer Vision.



Crack propagation is the most common cause of material failure; now researchers are using innovative real-time measurements to gain new insights into the singular fields found at the tips of propagating cracks. This research holds real importance for our understanding of both material strength and earthquake dynamics, as Professor Jay Fineberg explains

Where friction meets fracture
The vast majority of homogeneous material failures are caused by crack propagation, where a defect develops and eventually propagates when sufficient stress is applied. The space-time dynamics that lead to the failure of both bulk materials and frictional interfaces is an area of great interest to Professor Jay Fineberg, coordinator of the FractFrict project. "A crack will propagate at nearly the speed of sound, which is the speed at which information can propagate in a material," he explains. The project is investigating the dynamics of near-tip singular fields, the putatively infinitely large stresses that the tips of cracks feel in their near vicinity. "What we're trying to understand is how the singular fields develop as a crack starts to speed up, and how a crack then changes. It's known that cracks can change their behaviour when they're moving fast enough – how does this come about?" says Professor Fineberg.

Stress focuser
This work is built on a fundamental understanding of how cracks behave under stress. A crack acts as a kind of lightning rod when a homogeneous stress is applied to a material, focusing the stress at the crack's tip. "If you have a crack, at the tip of the crack you have nearly infinite stresses as long as you're applying some overall stress to the material itself," explains Professor Fineberg. The stresses at the tip of a mathematically sharp crack will become infinite, which causes material bonds to break and makes the crack propagate. "At a certain value or a certain speed a crack stops wanting to be a single crack and becomes multiple cracks, or multiple microscopic cracks. This is one of the things we are now starting to understand – how this comes about and how this can be controlled," continues Professor Fineberg. "We're investigating that by effectively slowing down the speed of sound in the materials we're using."
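In linear elastic fracture mechanics, this 'lightning rod' picture has a standard mathematical form: the stress field near the tip diverges as the inverse square root of the distance to it. This is the textbook result the project builds on, not a formula unique to FractFrict:

```latex
\sigma_{ij}(r,\theta) \;\sim\; \frac{K}{\sqrt{2\pi r}}\, f_{ij}(\theta)
\qquad \text{as } r \to 0
```

Here r and θ are polar coordinates centred on the crack tip, f_ij(θ) is a universal angular function, and the stress intensity factor K encodes the applied loading and geometry. The 1/√r divergence is precisely the singularity that real materials must somehow blunt at small scales.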


The Transition from Static to Dynamic Friction: Friction is Fracture
Laboratory earthquakes enable the transition to frictional sliding. Frictional sliding only takes place when the interface that separates two sheared bodies "fractures". This occurs via an earthquake that propagates within the interface: a crack-like rupture front, moving at nearly the speed of sound, that fractures the contacts forming the interface. Only when this front traverses the entire contact surface does any overall "macroscopic" motion of the bodies take place.

The speed of sound in materials like window glass is measured in thousands of metres per second. A type of aqueous gel is being used to slow this down considerably, allowing Professor Fineberg and his colleagues to observe crack propagation as it takes place in much greater detail than previously possible. "These are very soft materials but very fragile, and you can play with the chemical content to change their characteristics," he outlines. Techniques like high-speed photography can then be used to gather information about the crack, while researchers are also able to study their singularities and regularisation modes. "A singularity is a putative mathematical infinity at the tip of a crack. But nature doesn't believe in infinities, so nature somehow has to blunt this infinite stress," explains Professor Fineberg. "The blunting of the singularities takes place very near the tip of the crack, so at very small scales – and usually at very high velocities in standard materials." Researchers can now observe how nature blunts this putatively infinite stress at the tip of a crack. The ability of a material to deal with cracks is what ultimately governs its strength and its range of potential applications, underlining the wider relevance of the project's research. "Cracks in sheet glass and titanium for example have very similar behaviours as they approach the tip, and what occurs near the tip to change or negate this infinite stress is what will tell you whether a material is fragile or not," says Professor Fineberg. The project is currently studying the precise mechanisms involved in this process. "We get information about the changes of these singular fields as the crack forms them at very high spatial resolution. We do it through a combination of measuring the distortion of microscopic grids imprinted on the material and using high-speed photography as the crack goes into them," outlines Professor Fineberg. The project's work has helped to significantly extend the theory of fracture, describing how macroscopically imposed stresses are translated to these very large stresses at microscopic scales. If these very strong stress fields can be prevented from developing, it will be possible to significantly increase the macroscopic strength of materials. "We're currently researching how to regularise or blunt these singular stresses," says Professor Fineberg. The project's results may be widely applicable, enabling the design of materials with strength characteristics tailored to their intended application. "We hope that, with more fundamental knowledge of the building blocks that determine material strength, this will help to design stronger or weaker materials as required. We aim to effectively control the strength of materials by affecting what's going on at the tip of a crack," outlines Professor Fineberg.

Sequence of photographs of the tip of a crack moving at approximately 50% of the material's sound speed, as the crack undergoes a microscopic branching instability. The eventual length of the microscopic branch formed is a few tens of microns, yet, as can be seen in this sequence, its formation has an enormous effect on the crack's dynamics.

We published a paper in Nature in which we found that the mathematics that describe earthquakes are virtually identical to those which describe the cracks that propagate under imposed shear forces

Earthquake dynamics
Researchers are also addressing fundamental questions about what causes the transition from static to dynamic friction, work which has significant implications for our understanding of earthquake dynamics. This is an area in which we still lack fundamental knowledge; Professor Fineberg says that the real points of contact between two contacting planes, such as a tectonic plate or another frictional interface, are orders of magnitude smaller than their nominal area. "They form a weak interface at the plane of contact," he explains. It is the strength of these contacting points that actually dictates the frictional strength along the interface. "If these two planes are pushed together, for example under tectonic pressures, then you'll find that you're increasing the real number of contacting points," continues Professor Fineberg. "Even under huge pressures, the real points of contact are still very, very sparse though." The process of breaking this sparse plane of contacts is actually a process of fracture. "They don't all break at the same time, but there's a well-defined crack that moves along this sparse interface, and breaks the contacts one after another," says Professor Fineberg. This is very similar to the process by which cracks propagate in bulk materials, determining material strength. "The same types of singularities that cause the fracture of bulk material are at the tip of a crack that's now propagating across this frictional interface, and breaking these contacts," explains Professor Fineberg. "So if you take two blocks of material, and you cause them to slide upon one another – what you're effectively doing along that interface that separates the two materials is creating a small earthquake. That earthquake is breaking the contacts that lock the materials together, which we call frictional forces. In this sense, friction is actually a fracture process."


Researchers can now visualise the real area of contact along a long frictional interface, observing contacts as they break in real time. This process again occurs very fast. "A crack moves along a frictional interface at sound speeds of the material, which in the materials we use is between 1,000 and 2,000 metres a second," outlines Professor Fineberg. At the same time, researchers can measure the stresses at the tips of these cracks – data from which Professor Fineberg has been able to draw new insights. "We published a paper in Nature in which we found that the mathematics that describe earthquakes are virtually identical to those which describe the cracks that propagate under imposed shear forces," he continues. "We now have a very good fundamental understanding of what I would describe as miniature earthquakes. We can create earthquakes in the lab and watch them unfold – this has helped us create a new paradigm of what friction really is. These same insights are providing fundamental new understanding of the physics of earthquake dynamics."
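The stick-slip character of these laboratory earthquakes can be caricatured by a single spring-loaded block that sticks until the spring force exceeds static friction, then slips until the force relaxes to the dynamic level. A toy sketch with invented parameters – far simpler than the interface rupture fronts the group actually images:

```python
def spring_block_earthquakes(steps=2000, dt=0.01, v_drive=1.0,
                             k=1.0, f_static=2.0, f_dynamic=1.0):
    """Minimal stick-slip ('laboratory earthquake') caricature:
    a driver plate loads a spring attached to a block; the block
    sticks until the spring force exceeds static friction, then
    slips until the force drops to the dynamic friction level."""
    plate, block = 0.0, 0.0
    events = []
    for step in range(steps):
        plate += v_drive * dt                # steady tectonic-like loading
        force = k * (plate - block)
        if force > f_static:                 # static threshold exceeded: slip
            slip = (force - f_dynamic) / k   # block jumps until force = f_dynamic
            block += slip
            events.append((step * dt, slip)) # record the 'earthquake'
    return events

for t, slip in spring_block_earthquakes()[:5]:
    print(f"t = {t:5.2f}  slip = {slip:.2f}")
```

Even this caricature reproduces the basic cycle of slow loading punctuated by sudden slip; the project's contribution is to show what actually happens along the interface during such an event.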

At a glance
Full Project Title
Fracture and Friction: Rapid Dynamics of Material Failure (FractFrict)
Project Objectives
FractFrict is a comprehensive study of the space-time dynamics that lead to the failure of both bulk materials and frictionally bound interfaces. Failure in both is precipitated by rapidly moving singular fields at the tips of propagating cracks or crack-like fronts that cause material damage at microscopic scales. The primary objective of the research is to achieve a fundamental understanding of the interrelation between the fields that drive microscopic damage and the resultant macroscopic modes of failure.
Project Funding
1. The European Research Council, Grant No. 267256
2. The James S. McDonnell Fund, Grant No. 220020221
3. The Israel Science Foundation, Grant No. 76/11
Contact Details
Project Coordinator, Professor Jay Fineberg
The Racah Institute of Physics, The Hebrew University of Jerusalem, Jerusalem 91904, Israel
T: +972 2 6585207
E: jay@mail.huji.ac.il
W: http://www.phys.huji.ac.il/~jay/

Professor Jay Fineberg

Professor Jay Fineberg leads his experimental physics group at the Hebrew University. The recipient of a number of prizes and honors, Prof. Fineberg has been at the Hebrew University since 1992, after completing his post-doctoral work at the University of Texas, Austin. His experimental work has made numerous contributions to our fundamental knowledge of driven nonlinear systems, as well as to the physics of fracture and frictional motion.



Deep insights into the carbon cycle
The primary objective of the Carbonsink project is to resolve the fate of the carbon deposited in marine sediments and oceanic crust. We spoke to project coordinator Dr Sasha Turchyn about her research into how the removal of carbon from the earth's surface is regulated over geological timescales
The geological carbon cycle operates over extended timescales, with carbon residing on the surface of the Earth for around 200,000 years. It can exist as carbon dioxide, dissolved in water, or in plants and animals, before eventually being removed into sediments and becoming rock. Carbon can remain as rock for hundreds of millions of years before eventually returning to the surface carbon pool. The underlying mechanisms of this cycle are the primary research focus of the Carbonsink project. "The project is looking at how the carbon cycle functions over geological timescales, so over tens of millions of years," says Dr Sasha Turchyn, the project's coordinator. The amount of carbon at the surface of the earth – in the atmosphere, oceans, soils and biological systems – has varied significantly over these timescales. It is thought that there is currently a total of about 40,000 gigatonnes of carbon at the surface of the earth; around 750 gigatonnes of that is in the atmosphere, which has a major influence on the temperature of the planet. "Over long periods of time the quantity of carbon in various surface reservoirs is not fixed. In fact, the amount of carbon in the atmosphere was 30 percent smaller than pre-industrial levels

CarbonSink – The Subsurface Sink of Carbon in the Marine Environment
Project Coordinator: Dr Sasha Turchyn
Department of Earth Sciences, University of Cambridge, Downing Street, Cambridge CB2 3EQ, UK
T: +44 1223 333479
E: avt25@cam.ac.uk

Dr Sasha Turchyn received her PhD from Harvard in 2005 and came to the Department of Earth Sciences at the University of Cambridge in 2009, after a postdoc at the Miller Institute for Basic Research at the University of California, Berkeley, and a short career break. Since arriving at Cambridge she has built the Laboratory for Marine Biogeochemistry in the department. Her work spans the interface between isotope geochemistry, paleoceanography and geomicrobiology.


around 20,000 years ago because the oceans contained more carbon,” explains Dr Turchyn.

Marine sediments
Researchers are looking at how carbon at the surface is removed into the minerals found in marine sediments. Microbes and bacteria play a significant role in helping drive the formation of carbon-containing minerals in sediments. "My hypothesis was that bacteria that live within sediments are helping to generate minerals that have calcium and carbon in them – limestone basically – and that the activity of that bacteria is helping to remove carbon from the surface," continues Dr Turchyn. "Rather than try to quantify that flux by looking at carbon, I wanted to quantify that flux by looking at calcium." The main tool being used is a very specific chemical property of calcium, related to its isotope ratio. From the information gathered about calcium fluxes, researchers can then draw inferences about the volume of carbon. "If you take mud, even from as deep as 100 metres below the sea floor, and spin it down in a centrifuge, the fluid between the grains will come to the top of the centrifuge and the mud will go to the bottom," explains Dr Turchyn. "From the measurements of those fluids, we were able to calculate the rate at which different elements were diffusing into and out of the sea floor."

My hypothesis was that bacteria that live within sediments are helping to generate minerals that have calcium and carbon in them – limestone basically – and that the activity of that bacteria is helping to remove carbon from the surface

Researchers are also growing bacteria in the lab to investigate the link between the metabolism of the bacteria and the formation of this mineral in ocean sediments. One area of interest is speeding up the growth of the bacteria, and assessing the impact on mineral formation. "We're trying to understand it over natural timescales; when you pump carbon dioxide into sediments, do you stimulate bacteria to grow faster? Do the bacteria then do something else? How does that affect the natural processes within the sediments?" outlines Dr Turchyn.
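The rate calculation Dr Turchyn describes rests on a standard piece of geochemistry: Fick's first law, corrected for sediment porosity, turns a measured pore-fluid concentration profile into a diffusive flux. The sketch below illustrates the arithmetic with invented numbers – it is not project data:

```python
import numpy as np

# Hypothetical pore-fluid calcium profile: depth below the sea floor (m)
# and dissolved Ca concentration (mol/m^3). All values invented.
depth = np.array([0.0, 5.0, 10.0, 20.0, 40.0, 80.0])
ca = np.array([10.3, 10.0, 9.6, 9.0, 8.1, 6.8])

porosity = 0.7   # assumed sediment porosity
d_ca = 7.9e-10   # approximate diffusion coefficient of Ca2+ in water, m^2/s

# Fick's first law with a porosity correction: J = -phi * D * dC/dz.
# With z increasing downward, a concentration that falls with depth
# gives a positive J, i.e. calcium diffusing down into the sediment.
dc_dz = np.gradient(ca, depth)               # mol/m^3 per metre
flux_seafloor = -porosity * d_ca * dc_dz[0]  # mol per m^2 per second
print(f"Ca flux into the sediment: {flux_seafloor:.2e} mol m^-2 s^-1")
```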

Climate change
This approach of pumping carbon back into sediments has been proposed as a way of mitigating anthropogenic climate change. However, the long-term impact of these kinds of measures is not fully understood; Dr Turchyn plans to pursue further research in this area. "We're continuing to try and investigate how bacterial metabolism, or the rate of activity in the deep biosphere, relates to the amount of carbonate minerals that are produced. Over the next three years we'll explore the importance of this process over the course of the earth's history," she outlines.

Reprinted from Sun and Turchyn, Nature Geoscience 2014. A map of the flux of calcium within marine sediments created through spatial interpolation among pore fluid profiles collected through the various Ocean Drilling Programs.



A window into the New Kingdom
Sai Island, located between the second and third cataracts of the Nile in modern-day Sudan, was occupied by Egypt during the period of the Egyptian New Kingdom. Artefacts and archaeological remains on the island will offer new insights into settlement patterns in Upper Nubia during Pharaonic times, as Professor Julia Budka of the AcrossBorders project explains
The towns and settlements of Upper Nubia, located between what is today northern Sudan and southern Egypt, have a rich cultural heritage, evidence of the diverse influences that have shaped the area over the course of its history. However, relatively little is known about the architecture, social structures and material culture of towns in the area during the 'New Kingdom' era, roughly between the 16th and 11th centuries BC, when Egypt occupied much of Nubia; this is a period of great interest to Professor Julia Budka, Principal Investigator of the AcrossBorders project. "The aim of the project is to get a better understanding of settlement patterns in Upper Nubia during Pharaonic times, particularly the so-called New Kingdom," she outlines. Researchers are looking at material and artefacts from three major settlement sites. "Our key site is Sai Island in northern Sudan. We also work at two sites in Egypt – Abydos and Elephantine," says Professor Budka.


Sai Island
A large island located between the second and third cataracts of the Nile, Sai has a long history of occupation, dating from palaeolithic times right through to the present day. Each successive generation of settlers has left its mark on the island. "Several Nubian cultures occupied the island in pre-historic times, then the Egyptians arrived in the second millennium BC. The New Kingdom occupation was followed by the Post-New Kingdom. One of the heydays of Sai was during the Ottoman period – a large fortress was erected on the island during that time," says Professor Budka. The Pharaonic settlements established on the island were quite small in scale, and while not all of the archaeological remains are well preserved, the site offers some important benefits. "Unlike most sites in Egypt, the Pharaonic remains have not been built over in modern times, which is really good for us, so we can work there without any problems," explains Professor Budka.

There was an indigenous presence of ancient Nubians in the town, and also of course Egyptians lived there too. So we are really dealing with a very complex microcosm

AcrossBorders team on Sai Island, 2015 (Photo: Julia Budka).

The project's primary aim is to assess whether the remains found on Sai Island allow researchers to judge the settlement as a typical Egyptian settlement of the time, in comparison with those at Elephantine and Abydos. While Egypt occupied Upper Nubia at the time of the New Kingdom, the population was in fact quite diverse. "Although the town was built by the Egyptians and of course Egyptians lived there, there was also an indigenous presence of ancient Nubians in the settlement. So we are really dealing with a very complex microcosm," stresses Professor Budka. The indigenous population lived together with the Egyptian settlers during the time of the New Kingdom, and played a full part in cultural exchange. "Now that we are investigating the archaeological remains and town life it's obvious that in simple, daily things like cooking ware there is also a strong Nubian influence," continues Professor Budka. These findings run contrary to earlier thinking, which firstly saw a clear boundary between the Egyptians and Nubians, and secondly saw Egyptian culture as far more advanced and sophisticated than the Nubian. Professor Budka believes the true picture is actually far more complex, and that the two cultures shared ideas and technologies with each other. "The indigenous peoples really played an important role for the Egyptians, and I'm also convinced that Egyptian settlers married Nubian women. The question is whether all of this is reflected in the archaeological remains," she says. The project's working hypothesis is that Sai Island during the time of the New Kingdom was really an Egyptian microcosm. "Egyptians were the main occupiers, the main population, and clear parallels can be drawn with other towns in the country, but Sai Island also has certain peculiarities," says Professor Budka. "These peculiarities are because we have this mix of Egyptian and Nubian culture. It's a local version of an Egyptian town."

Various steps of work of the AcrossBorders project: J. Budka in the field (Photo: Nathalie Bozet); H. Magzoub drawing pottery (Photo: Nathalie Bozet); K. Griffin registering objects (Photo: Julia Budka).


Living standards
Researchers are investigating artefacts found on Sai Island to build a more detailed picture of these peculiarities and the lifestyle of its inhabitants at the time of the New Kingdom. Professor Budka is particularly interested in social interactions and the people who lived there. "Were the occupants Egyptians, Nubians, or a mixture of both? How did the Egyptians interact with their neighbours, with the indigenous Nubians? What about the women on the island in antiquity? Did women come from Egypt along with their Egyptian husbands, or did the Egyptians inter-marry with Nubians?" she asks. One of the important aspects of the project's work on Sai Island is that it enables Professor Budka and her colleagues to go one step further and compare an Egyptian site, in the Nubian context, with Egyptian sites in Egypt proper. "From this I would really expect a lot of information about the Pharaonic lifestyle in the New Kingdom," she enthuses.

The general aim of the project is to get a better understanding of settlement patterns in Upper Nubia during Pharaonic times, particularly the so-called New Kingdom

A key part of the project's work is analysing the material culture, archaeology and architecture of Sai, then comparing it with the other two sites. One major area of interest is the pottery artefacts found on Sai. "The corpus is very large. Of course we have completely ordinary daily wares and shapes – like cooking pots, serving dishes and large storage vessels – but you also find luxury stuff. We have found imported materials from Cyprus, Mycenae and the Levant," says Professor Budka. It is possible to date these artefacts quite precisely, from which researchers can draw wider inferences about the lifestyles of people living on the island. "The settlement remains that we've found from Nubia are much better preserved than other settlement sites, and we can compare these with the Egyptian examples. This really allows us to better understand how social structures worked and also draw new insights into people's lifestyles," outlines Professor Budka. The type of funeral an individual is afforded at the end of their life is a major signifier of their status within society, and also of the social structure of the time. The project is excavating funeral remains at the most important New

EU Research


At a glance Full Project Title Across ancient borders and cultures: An Egyptian microcosm in Sudan during the 2nd millennium BC (AcrossBorders)

View of some archaeological remains of the New Kingdom on Sai Island (Photo: Julia Budka). Kingdom cemetery on Sai Island, which Professor Budka says was primarily used as burial site for people from the medium and upper echelons of society. “It’s clear that this is where medium- and high-ranked people were buried. At the moment we do not have any information on where lower-ranked people have their tombs,” she outlines. The burial sites show both Nubian and Egyptian influences. “At the beginning of the eighteenth dynasty, there was a mixture – the tomb architecture was Nubian in style, but the tomb goods and burial equipment were a mixture between Nubian and Egyptian,” outlines Professor Budka. “From the mid-eighteenth dynasty onwards, which we think was the heyday of Egyptian occupation on Sai, the tomb architecture was very much Egyptian in nature, as were the burial goods and the equipment.” The origins of the people who were buried at this cemetery are not entirely clear. While their names and titles are completely Egyptian, Professor Budka believes that many were originally born as Nubians, before gradually becoming acculturated. “We will investigate the human remains, because even if a person has an Egyptian name and an Egyptian title, it is by no means sure that they were born in Egypt. There’s a very good possibility that they were born in Nubia, and then raised within this Egyptian microcosm,” she says. This may reflect a gradual process of cultural acceptance; while at the beginning of the

www.euresearcher.com

New Kingdom period the Nubians and Egyptians were enemies, the Nubians quickly adapted and learned to co-exist with the new rulers. “The Egyptians were involved in several wars and campaigns against the Nubians at the beginning of the New Kingdom period. But we think that after the Egyptian victory the negative picture of a vile Nubia changed within two or three generations, and a heyday of Egyptian culture in Nubia developed in which indigenous Nubians were actively involved,” continues Professor Budka. Researchers are currently analysing the samples that have been gathered from the three key sites, and plan to publicise their findings in reports and monographs. Beyond the initial scope of the AcrossBorders project, Professor Budka hopes their research will lead to further collaborations in future. “We have established very good contacts within the project with our French, British and American colleagues. I hope that within the next years we can develop a joint project that will follow-up on the work of AcrossBorders,” she says. Sai Island has worked very well as a case study, but scientists need to extend research to neighbouring sites and the hinterland to build a more complete picture of settlement patterns in Upper Nubia. “We would like the opportunity to widen the perspective of our research. In general I would like to stick with the main questions we are investigating now, but to put them on the next level in future,” says Professor Budka.

At a glance

Full Project Title
Across ancient borders and cultures: An Egyptian microcosm in Sudan during the 2nd millennium BC (AcrossBorders)

Project Objectives
Settlement archaeology in Egypt and Nubia (Northern Sudan) is still in its infancy, and AcrossBorders aims to strengthen this field. Three major settlement sites (Abydos, Elephantine and Sai Island), situated across ancient and modern borders and with diverse environmental and cultural preconditions, show very similar archaeological remains. For the first time, the project tries to explain this situation in detail, from multiple perspectives and with the application of various methods.

Project Funding
1,497,460 euros

Project Partners
Full details on website

Contact Details
Project Coordinator, Professor Julia Budka
LMU Munich
Institut für Ägyptologie und Koptologie
Katharina-von-Bora-Str. 10
D-80333 München
T: +49 (0)89 289 27543
E: Julia.Budka@lmu.de
W: http://acrossborders.oeaw.ac.at/

Julia Budka, The New Kingdom in Nubia: New results from current excavations on Sai Island, in: Egitto e Vicino Oriente 37, 2014, 55–87.
Julia Budka, The Egyptian "Re-conquest of Nubia" in the New Kingdom – Some Thoughts on the Legitimization of Pharaonic Power in the South, in: Royal versus Divine Authority, ed. by F. Coppens, J. Janák & H. Vymazalová, Königtum, Staat und Gesellschaft früher Hochkulturen 4,4, Wiesbaden 2015, 63–82.

Professor Julia Budka

Professor Julia Budka studied Egyptology and Classical Archaeology at the University of Vienna. Even before being awarded her PhD in 2007, she held a research position at Humboldt University Berlin (2004-2012). In 2012 she was awarded both an FWF START prize and an ERC Starting Grant. Budka was appointed Professor for Egyptian Archaeology and Art History at LMU Munich in 2015.



A strong basis for environmental legislation
The European Commission recently financed a study to review current air quality legislation and to assess how its public acceptability can influence the effectiveness of future policies. This is a context in which the work of the SEFIRA project takes on real importance, as project coordinator Professor Michela Maione explains. The European Commission

depends on accurate, reliable data in drafting and implementing air quality legislation, such as the National Emissions Ceiling Directive. While the effects of air quality on health are of course a prominent concern, legislators also need to consider the likely socio-economic impact of legislation, issues that the SEFIRA project is addressing. "We have quite a diverse consortium in SEFIRA, with competences from the hard sciences, along with social scientists, economists, geographers and others," says Professor Michela Maione, the project coordinator. The project aims to create trans-disciplinary scientific and socio-economic resources to support the EC's review of existing air quality legislation and the eventual implementation of updated measures; Professor Maione says that the trans-disciplinary nature of this work is challenging. "Scientists from different disciplines tend to speak very different languages. So it was not very easy at the beginning to put all these things together, and to start the project," she explains. "Then we are using discrete choice models, which have never previously been used in this field. They are being used to evaluate the acceptability of air quality policies." These policies have generally sought to limit the emissions of harmful pollutants. There is a clear distinction here between emissions and the levels of a pollutant that may be present in the atmosphere. "Emissions are the quantity of a pollutant that is emitted by a source over time. Levels are the concentration that you have in the

atmosphere - that’s the consequence of the emissions,” says Professor Maione. The European Union has been actively trying to improve air quality since the 1970s, partly through controlling emissions, yet such policies must be accepted by the public if they are to be effective, an issue which is central to the SEFIRA project’s research. The project’s scope encompasses elements of atmospheric sciences, environmental and legal sociology, anthropology, geography and economics, and researchers are bringing together data gathered from questionnaires and discrete choice models.
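The distinction can be made concrete with a textbook one-box model – a generic illustration, not a SEFIRA tool: a pollutant emitted at rate E into a well-mixed air volume V and removed at a first-order rate k settles towards a steady concentration of E/(kV), so the level lags behind, and is the consequence of, the emissions. A minimal sketch in Python, with all numbers invented for illustration:

# Toy one-box air-quality model (illustrative values only, not SEFIRA data).
# Emissions E feed a well-mixed air volume V; removal (deposition, outflow)
# acts at first-order rate k, so dC/dt = E/V - k*C and C_steady = E/(k*V).
E = 2.0e12    # emission rate, micrograms of pollutant per hour (assumed)
V = 1.0e12    # volume of the mixed air box, cubic metres (assumed)
k = 0.05      # removal rate, per hour (assumed)

c, dt = 0.0, 1.0                     # concentration (ug/m3), time step (h)
for hour in range(200):
    c += dt * (E / V - k * c)        # explicit Euler integration

print(f"concentration after 200 h: {c:.1f} ug/m3")
print(f"steady-state level:        {E / (k * V):.1f} ug/m3")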

Discrete choice models This latter aspect of the project’s research relates to individual choices and decisions rather than the overall economic impact of air quality legislation. While many of us would agree when asked that it’s important to improve air quality, we tend to be a little more circumspect when we are confronted with the practical effects of legislation. “What we do in SEFIRA is to evaluate whether a citizen is likely to accept a policy over another. What trade-offs are they willing to make?” outlines Professor Maione. Existing survey methodologies can gather data on people’s general attitudes towards air quality, but not the changes they’re ready to make to improve it; by contrast, discrete choice models can be used to evaluate the trade-offs people

are willing to make. “With discrete choice models you can ask the respondent how much they are ready to pay to improve the environment. Would they prefer to pay money, or would they prefer to reduce the use of polluting means of transport?” says Professor Maione. “Then of course we also know that production of meat and dairy products are major factors in atmospheric pollution. So would people be willing to reduce their intake of those foods?” The project presents a combination of these choices to citizens in seven countries, whose responses are then evaluated in the discrete choice model. These choices relate not only to financial and lifestyle-related issues, but also wider, more long-term considerations. “Premature death is another attribute – how much do you want to reduce the total number of premature deaths? Then there’s the distribution of the policy cost. For example, would you prefer that people who pollute more pay more? Or do you prefer that everybody pays the same? These are the kinds of attributes that we have in our exercise,” says Professor Maione. This work is built on solid scientific foundations and fundamental research into air quality. “When we started this dialogue with social scientists and economists, we immediately found out that there was the need for a common language,” continues Professor Maione. “The most recent research findings are very useful in building this kind of exercise. So, we base our inputs into the model on the development of these exercises, and use our scientific expertise to disentangle some common misunderstandings.”
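The kind of exercise Professor Maione describes can be sketched with a multinomial logit model, the standard workhorse of discrete choice analysis. In the toy example below, every attribute value and taste coefficient is invented for illustration; none of it comes from SEFIRA's surveys:

import math

# Each policy alternative is described by the attributes a respondent sees:
# (annual cost in euros, premature deaths avoided, car-free days per month).
alternatives = {
    "status quo":     (0,   0,    0),
    "modest policy":  (100, 2000, 1),
    "ambitious plan": (400, 8000, 4),
}

# Illustrative taste coefficients (assumed): utility falls with cost and
# car restrictions, and rises with the number of lives saved.
b_cost, b_deaths, b_car = -0.004, 0.0002, -0.15

def utility(cost, deaths_avoided, car_free_days):
    return b_cost * cost + b_deaths * deaths_avoided + b_car * car_free_days

# Multinomial logit: P(i) = exp(V_i) / sum_j exp(V_j), so the fitted
# coefficients translate directly into predicted acceptability.
v = {name: utility(*attrs) for name, attrs in alternatives.items()}
denom = sum(math.exp(x) for x in v.values())
for name, vi in v.items():
    print(f"{name:14s} predicted choice probability: {math.exp(vi)/denom:.2f}")

Fitting the coefficients to respondents' observed choices is what reveals the trade-offs; for instance, the ratio b_deaths/b_cost gives an implied willingness to pay per premature death avoided.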




The project is also involved in outreach work, raising awareness among citizens of recent scientific findings so that the wider public is well informed about air quality issues. Several stakeholder meetings have been held, at which scientists have explained the scientific basis for air quality legislation to citizens and stakeholders. "The relationship between air quality and climate change is becoming clearer. This is not really well known, and it's something that we are trying to explain and measure; what is the real awareness of this problem among citizens?" says Professor Maione. While the majority of people are aware of environmental and air quality issues, Professor Maione says they often see industry as the main cause of atmospheric pollution. "Most of the respondents say that they think industry is primarily responsible for atmospheric pollution in their country. I think that's an interesting answer – they kind of delegate the responsibility for dealing with atmospheric pollution to other people," she outlines. "Only a very small percentage quote domestic waste, domestic heating and agriculture as important factors in atmospheric pollution." This runs contrary to recent scientific findings, which have shown that domestic heating and biomass burning are in fact among the main factors behind atmospheric pollution in many parts of Europe, reinforcing the importance of the project's outreach work. Researchers aim to gauge public opinion and awareness of the issues around air quality. "We are trying to explain and also to measure; what is the real level of awareness of this problem among citizens?" says Professor Maione. The project data will be made available once it has been completed and validated, and Professor Maione says there are also plans to bring their findings to the attention of policy-makers. "We are in discussions with the European Commission and the European Parliament about presenting our results to the environmental committee. On top of that, we also want to write more focused policy briefs, which we will translate into all the different languages of the parties represented in the consortium, so that we can also reach policy-makers and administrators from other countries within the consortium," she explains.


Air quality legislation

This is central to building a common, broad-based approach to air quality legislation among European countries, reflecting the fact that pollution often crosses national and administrative boundaries. Pollution may affect an area quite a long distance from where it was generated, so the European Community has put policies in place to limit total emissions. "The current air quality directive was released in 2008. In 2013, the Directorate-General for Environment issued what they call an air quality package, with a proposal to improve the 2008 air quality directive, which had been approved by the parliament in 2008, but the process was halted. So the National Emissions Ceiling Directive is still under discussion," explains Professor Maione. The European directive is also linked to wider conventions, in particular the 1999 Gothenburg Protocol, which set emissions ceilings for several chemical compounds. "The main atmospheric circulation is from west towards east, so Europe is under the influence of pollution from the North American continent," says Professor Maione. The direct cost to society of air pollution is estimated at around 23 billion euros a year, underlining both the economic and social importance of improving air quality. However, legislation must be built on a solid understanding of people's attitudes if it is to achieve the desired impact, and the work of the SEFIRA project will have a crucial role to play in underpinning future policy decisions. "The discrete choice models are being used to evaluate the acceptability of policies in the field of air quality," says Professor Maione. Issues around air quality will of course persist beyond the term of the SEFIRA project, and Professor Maione plans to pursue further collaborative research in this area. "The EC has now issued a call, in Horizon 2020, for the provision of tools to administrators that will improve air quality and reduce greenhouse gas emissions in cities, and we have been in touch with other consortia on some proposals," she continues. "We hope that we will also have the opportunity to focus on some of these various topics that we have analysed in the SEFIRA project. For example food production, and the links with air quality and climate change."

At a glance

Full Project Title
Socio Economic Implications For Individual Responses to Air Pollution Policies in EU (SEFIRA)

Project Objectives
SEFIRA has the objective of creating a European coordination of trans-disciplinary scientific and socio-economic resources in order to support the review and implementation of air quality legislation by the European Commission (EC), led by DG Environment. Individual behaviours and choices have been analysed in a socio-economic context ranging from the local to the European level. The main fields involved in the action are atmospheric sciences, environmental and legal sociology, anthropology, geography and economics.

Project Funding
Total requested EU contribution: 998,000 euros

Project Partners
For full partner details, please visit http://www.sefira-project.eu/ad/12-2/

Contact Details
Professor Michela Maione
Università degli Studi di Urbino "Carlo Bo"
DiSBeF - Sezione di Scienze Chimiche
Piazza Rinascimento 6
61029 Urbino - Italy
T: +39 0722 303316
E: michela.maione@uniurb.it
W: www.sefira-project.eu

Professor Michela Maione

Project Coordinator

Michela Maione is Associate Professor of Environmental Chemistry at UNIURB. Her research focuses on the changing composition of the atmosphere and its implications for air quality and climate change. She is responsible, within the framework of various international initiatives, for long-term observation programmes of climate-altering species, including short-lived climate forcers and atmospheric pollutants. She has been PI in several EC-funded projects and is currently the coordinator of the EU FP7 coordination action SEFIRA (Socio-Economic implications For Individual Responses to Air pollution policies in EU+27).



Philosophical Collaboration Research in the physical sciences is often built on collaboration, yet there tends to be a different approach in the arts and humanities, where papers are largely attributed to one individual. Academics at University College London have introduced an experimental module this term for philosophy students, which will place greater emphasis on collaboration




Many of us think of philosophy as a sociable subject, in which students debate topics, listen to opposing points of view and refine their arguments. However, while there may be a popular image of philosophers spending enormous amounts of time arguing in smoke-filled rooms, students and academics tend to be lone wolves when it comes to the authorship of research papers. This stands in stark contrast to the physical sciences, where research papers tend to be the result of long-term collaboration between several people, who are often named as co-authors. Partly this may be a reflection of the differing nature of the subject areas, with physics research, to take just one example, often built on close coordination between scientists with complementary areas of expertise. The lone nature of much philosophy research still seems strange, though, when the needs of today's employers are taken into account. Job advert after job advert stresses the need for an applicant to be a 'good team player', 'well organised', and to have 'excellent communication skills', underlining the collaborative nature of the modern workplace. This is not to say that lone investigation precludes a philosophy student from developing those skills, but it might make it more difficult for a graduate to demonstrate their ability to collaborate and work together on projects to an interviewer. While philosophy students gain a grounding in critical thinking, historically there hasn't been a great deal of emphasis on collaboration.

Experimental module
There are small signs this may be changing, though, with an academic at University College London (UCL) taking a new approach to teaching this term. Professor Jonathan Wolff, Professor of Philosophy and Dean of Arts and Humanities at UCL, plans to introduce an experimental module for the new term, challenging students to work more collaboratively. "Philosophy students will be put into small groups. Their task will be to produce a collective report on a burning public issue," he wrote in the Guardian. "Should we permit assisted dying? What should we do about climate change? That sort of thing. They'll have a term to solve the world's thorniest problems." Even our best and brightest might struggle with that, of course, yet the really noteworthy point here is that students are being asked to work in groups, and to share responsibility in a collective endeavour. While this may seem an obvious point, it hasn't always been the case, with much work in the humanities the product of a lone researcher.

This springs in part from the mental image that still persists of philosophy as a highly abstract subject, practised by learned, well-read individuals. It might be said that the nature of the subject itself requires students to lift themselves away from everyday concerns and take a step back, as E.R. Emmet commented in Learning to Philosophise. "Their activity is likely to be mental rather than physical, and this activity is likely to arise, not from a practical need to answer certain questions – as with necessity and invention – but from a natural curiosity which requires for its indulgence a measure of freedom from practical preoccupations," he wrote. The nature of philosophy often involves asking questions to which there may not be an obvious answer, or indeed an answer at all, at least in the definitive sense. Again, this is different from fields like physics and chemistry, in which data has been gained through rigorous investigation, experimentation and physical observation, leading to the accumulation of knowledge and expertise. On the surface this seems quite different to the nature of philosophy research today, which often starts with speculation and the spirit of inquiry, rather than hard data. However, it's important to note here the roots of philosophy as a term, which encompasses far more subject matter than we associate with the subject today. "The word 'philosophy' was first used by the Greeks to mean the love of knowledge or wisdom. It is open to doubt whether man's wisdom has increased in the last two thousand years, but there can be no doubt at all about his increase in knowledge," wrote Emmet. As we learned more about the world around us, individual fields of scientific enquiry were distinguished from philosophy, which today is thought of as encompassing several distinct areas of enquiry, including ethics, metaphysics and the philosophy of language. The research method itself is not unique, however, as Karl Popper wrote in The Logic of Scientific Discovery. "There is no method peculiar to philosophy... And yet, I am quite ready to admit that there is a method which might be described as 'the one method of philosophy'. But it is not characteristic of philosophy alone; it is, rather, the one method of all rational discussion, and therefore of the natural sciences as well as philosophy," he wrote. So while we have acquired knowledge about the fundamental nature of matter, the periodic table and biological systems, we are still no closer to answering questions around the ultimate nature of truth, the extent of free will, and the essence of beauty. That doesn't mean we have stopped asking them, though, and philosophers today continue to ponder seemingly intractable questions.





This research is not necessarily going to produce a set of clear answers to those questions, but many argue that the journey itself is intrinsically worthwhile, and inculcates habits of mind that will stand philosophy students in good stead in their future careers. The German philosopher Immanuel Kant once told his pupils: "You will not learn from me philosophy, but how to philosophise, not thoughts to repeat, but how to think. Think for yourselves, enquire for yourselves, stand on your own feet." This is central to the pursuit of research and an individual's capacity for original thinking, with many philosophers hoping to find a new perspective on specific topics, or to dig deeper into established lines of thinking, while still working in line with the fundamental principles of logic and reasoning as set down by earlier generations. The nature and extent of philosophy research was described by Emmet: "We shall hope to sort out and tidy up some problems and discover the kind of question that it makes sense to ask and the kind of answer that we can expect to get; we shall hope to discover something about the nature and the degree of the certainty that is attainable. And we shall hope to end up with more knowledge, more wisdom and a clearer understanding."


The scope of philosophy study tends to be quite broad. Philosophy students today still study well-established topics like metaphysics, philosophy of mind and 19th and 20th century European Philosophy, but alongside this students at UCL will also consider major public issues, such as those described by Professor Wolff. While questions around assisted dying, to take just one example, have no clear answer, it’s certainly fertile ground for philosophical debate, and demonstrates the enduring relevance of the topic. Questions around the nature of truth, or free will, are well-known areas of study, but philosophy is also central to our understanding of modern dilemmas. There is already an extensive philosophical literature on assisted dying, reflecting its prominence and the depth of public concern around the issue. The UK parliament recently rejected plans to allow terminally ill patients to end their lives with medical supervision, with many MPs concerned that changing the law would leave vulnerable people without adequate safeguards. On the other side, MPs argued that social attitudes had changed, and that the existing law left many people enduring an unnecessarily painful and protracted death. The prospect of this has in the past led many people to travel abroad to the Dignitas clinic in Switzerland to end their lives, yet Parliament still came out against changing the law.



This is very much an international debate, and several assisted suicide cases have come before the US Supreme Court over the last few years. While physician-assisted suicide is legal in four American states, and legislation is being considered in others, debate continues about its ethical, legal and practical implications. A group of six moral philosophers aimed to answer some of the major questions that arise around assisted suicide in The Philosophers’ Brief, a task which they approached in two key steps. “First, it defines a very general moral and constitutional principle – that every competent person has the right to make momentous personal decisions which invoke fundamental religious or philosophical convictions about life’s value for himself,” they wrote. “Second, it recognizes that people may make such momentous decisions impulsively or out of emotional depression, when their act does not reflect their enduring convictions; and it therefore allows that in some circumstances a state has the constitutional power to override that right in order to protect citizens from mistaken but irrevocable acts of self-destruction.”

Collaborative work
These are issues that students at UCL will grapple with in the course of their studies, and the collaborative element will add an extra dimension, helping to prepare students for the workplace. The university experience is not purely about preparing students for the workplace, but with fees set to rise further, students are increasingly focused on the end result of their studies. Philosophy graduates bring many other qualities to an employer, well beyond the question of whether they're good foot-soldiers who will dutifully carry out orders. The capacity to take responsibility for a project and follow it through shows a high level of initiative, intellectual rigour and commitment to the task, all attributes valuable to an employer. The ability to persuade, to negotiate, to think logically and to solve problems is also highly valued. Nevertheless, the modern workplace almost invariably involves some level of team-working and collaboration, and Professor Wolff believes it's important to give students a grounding in this type of work. "For most people this course is simply a rehearsal for working life," he writes. "If you can't work in a team you'd better be brilliant, already rich, or not worried about being poor. Or an academic in the arts and humanities."

www.ucl.ac.uk



New theory for next generation physics
The Large Hadron Collider (LHC) has recently been re-started, with experiments at higher energies promising to generate new insights into the interactions of fundamental particles. The MC@NNLO project aims to establish a new level of theoretical precision to describe the data generated by particle collider experiments, as Professor Nigel Glover explains.

The recent re-start of the Large Hadron Collider (LHC) promises to generate exciting insights into particle interactions, with the energy of the proton beams being increased to record levels in the next few years. Alongside gathering experimental data, researchers are also developing theoretical predictions of what will be observed, an area that forms the primary focus of the MC@NNLO project. "We are trying to make more precise theoretical predictions of what will be seen at the LHC," explains Professor Nigel Glover, the project's Principal Investigator. Researchers will then compare their theoretical predictions with the precise data that will be obtained at the LHC. "We will either be able to extract measurements of fundamental parameters, like the strong coupling constant, or the coupling of the Higgs boson to other fundamental particles, or we will see that the theoretical predictions are inconsistent with the experimental data," continues Professor Glover. "In that case we will know that the theoretical model is inadequate, and we will try and improve it. The aims of our research are two-fold. First, we aim to make precision predictions to compare with the data, and then either extract some of the fundamental parameters or extend the theoretical model to incorporate some as yet unknown physics effects." The standard model of particle physics, a gauged quantum field theory, emerged as a very successful description of nature over the second half of the twentieth century, explaining a wide range of experimental results. It describes three of the four fundamental interactions perfectly.


Display of a proton-proton collision event recorded by ATLAS on 21 May 2015 at a collision energy of 13 TeV. Tracks reconstructed from hits in the inner tracking detector are shown as arcs curving in the solenoidal magnetic field. The green, red and yellow bars indicate energy deposits in the liquid argon and scintillating-tile calorimeters, clustered in a structure typical of a di-jet event. The most energetic jet has a transverse energy of about 45 GeV. (Image: ©CERN)

"The standard model describes the strong nuclear force, the weak nuclear force and electromagnetism," outlines Professor Glover. However, it doesn't describe either gravity or the Dark Matter that is thought to be responsible for the formation of large-scale structure in the Universe. "We expect that the standard model will eventually break down, and we know there has to be at least one Dark Matter particle. We don't know which energy range the Dark Matter particle inhabits, and that's why we're going to higher and higher energies to try to find it," says Professor Glover. "The LHC has just started operating at about twice its previous energy. We expect that the standard model will work in the new energy regime, but we don't know for sure that it will, and we hope to find evidence of new phenomena."

Perturbation theory
Researchers aim to establish a new standard of theoretical precision for describing the physical observables generated by these experiments at the LHC. One major area of interest is jets – narrow cones or sprays of hadrons and other particles produced in high-energy collisions. Professor Glover and his colleagues aim to calculate the properties of jets produced at the LHC using perturbation theory, a method of finding an approximate solution to a problem. "Mathematically, it's like a Taylor series expansion. You start with a first approximation, improve the estimate with the second term in the series, and then systematically make your prediction more accurate by including third-order, fourth-order and more terms in the series," he explains. The first and second approximations, the leading order and next-to-leading order predictions, are well understood; researchers now aim to improve the precision of theoretical predictions by including the third term in this series. "At the moment the experiments can measure data with an accuracy of, let's say, 10 percent, while the theoretical accuracy from next-to-leading order predictions is perhaps 25 percent. We're trying to make better predictions so that the theory and the experiment have similar levels of precision," says Professor Glover. The next-to-next-to-leading order (NNLO) corrections are very complex, and it's not currently possible to automate those calculations.
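Schematically – this is the generic structure of a QCD perturbative series, not a formula specific to the project – an observable such as a jet cross-section is expanded in powers of the strong coupling \alpha_s:

\sigma \;=\; \alpha_s^{\,n}\left[\,\sigma^{(0)} \;+\; \alpha_s\,\sigma^{(1)} \;+\; \alpha_s^{2}\,\sigma^{(2)} \;+\; \mathcal{O}(\alpha_s^{3})\,\right]

where \sigma^{(0)} gives the leading-order (LO) prediction, \sigma^{(1)} the next-to-leading-order (NLO) correction, and \sigma^{(2)} the NNLO term. With \alpha_s of order 0.1 at LHC energies, each successive term refines the estimate at roughly the ten-percent level – which is why the NNLO term is needed before the theory can match data measured to ten-percent accuracy.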

The project is providing some of the very first NNLO calculations of jet cross-sections, which Professor Glover says will help researchers make more precise measurements of the strong coupling and the internal structure of the proton. "If the theory doesn't make a reliable prediction, you don't learn very much, while if there is too much uncertainty in the experimental data you can't make a sensible measurement. In the best case, with precise data and precise theory, you can make a precise measurement," he outlines. This work could also help generate insights into the properties of new particles like the Higgs boson, or into as yet unobserved phenomena. "We can learn a lot when there's a deviation between the theoretical model and the experimental data. We make calculations using the standard model, and if the standard model really describes everything in nature, then it should agree with the data," explains Professor Glover. "If nature is more complex and more interesting, then at some point our theoretical calculations won't agree with the data. That would be really exciting and we would be challenged to make a more sophisticated theoretical model to describe the new phenomena."

Experimental data
This of course depends on the experimental data and the nature of the results from the LHC. Over the next year or so Professor Glover and his colleagues expect to derive results for some key scattering processes. "We should be able to make a series of new theoretical predictions for jet production, Higgs boson + jet production, and vector boson + jet production," he outlines. The next phase will then be to confront the experimental data with these theoretical calculations, which can then be used to make better measurements of the parameters of the standard model. "Beyond that, we will look towards the second half of the project, where we will work to match our calculations even more closely to experiments, using event simulation techniques," outlines Professor Glover. "In our perturbative calculations, the number of particles in the final state is very small, typically two, three or four, whereas experimental events typically have hundreds of hadrons in the final state. There's another layer of theoretical development that we can invoke, in which we simulate the fragmentation of the few particles that we have in the theoretical calculation to produce a very complicated multi-particle final state, just like the experimenters see."
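That fragmentation step can be caricatured in a few lines: starting from the handful of partons in a fixed-order calculation, each particle repeatedly splits with some probability until its energy falls below a cutoff, turning a two-particle final state into the hundreds of particles an experiment records. The splitting rule below is an invented toy, not the project's matching prescription:

import random

def toy_shower(energy, cutoff=1.0, p_split=0.7):
    """Recursively split a particle of the given energy (GeV) into two,
    sharing the energy randomly, until splitting stops or the energy
    falls below the cutoff. Returns the list of final-state energies."""
    if energy < cutoff or random.random() > p_split:
        return [energy]                  # this particle stops radiating
    z = random.uniform(0.1, 0.9)         # energy fraction taken by daughter 1
    return (toy_shower(z * energy, cutoff, p_split)
            + toy_shower((1 - z) * energy, cutoff, p_split))

random.seed(42)
# Start from a typical fixed-order final state: two 45 GeV jets.
final_state = toy_shower(45.0) + toy_shower(45.0)
print(f"{len(final_state)} final-state particles, "
      f"total energy {sum(final_state):.1f} GeV")   # energy is conserved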

At a glance

Full Project Title
High Precision Simulation of Particle Collisions at the LHC (MC@NNLO)

Project Objectives
The overall aim of the project is to establish a new standard of theoretical precision for the description of collider observables. This will be achieved by systematically including the next-to-next-to-leading order (NNLO) perturbative corrections in the relevant simulation tools, in two stages:
a) NNLO corrections for benchmark processes - NNLOJET
b) Matching NNLO calculations with the parton shower - MC@NNLO

Project Funding
1,941,145 euros

Project Partners
• Durham University, United Kingdom
• Universitaet Zuerich, Switzerland
• Eidgenoessische Technische Hochschule Zuerich, Switzerland

Contact Details
Project Coordinator, Professor Nigel Glover FRS
University of Durham
Science Laboratories
South Rd
DURHAM DH1 3LE
UNITED KINGDOM
T: +44 (0)191 334 3811
E: E.W.N.Glover@durham.ac.uk
W: https://www.dur.ac.uk/physics/staff/profiles/?username=dph0ewng

Professor Nigel Glover FRS

Project Coordinator

Professor Nigel Glover is Professor of Physics at the Department of Physics at Durham University. He is the Principal Investigator of the MC@NNLO project, which aims to make more precise predictions for physical observables at the LHC and other particle collider experiments, thereby enabling a more precise extraction of fundamental physics parameters.





Smart materials for a sustainable tomorrow The PHELIX project is drawing inspiration from the natural world as researchers seek to develop biological strategies to design and produce smart materials based on the structure of the helix. These materials of the future will adapt to changes in their environment and drive a paradigm shift in our engineering philosophy, as project coordinator Professor Nathalie Katsonis explains Smart materials hold

enormous potential in terms of addressing major social challenges. Complex and stimuli-responsive, smart materials are able to adapt to their environment and undergo purposeful change, which enables them to operate in the optimal way for the conditions in which they are being used. Responsiveness and adaptability would bring significant benefits to a wide range of industrial sectors, including aerospace, civil engineering, robotics and consumer goods, in particular by improving energy efficiency and sustainability. However, scientists have struggled to develop smart materials, with most materials still showing only moderate versatility and complexity. Researchers in the Phelix project are now tackling the challenge anew, drawing inspiration from biological architectures to develop new smart materials whose properties can be changed reversibly using external stimuli. "We aim to develop sophisticated materials that can save energy, self-heal, adapt to their environment and address a number of major societal challenges, such as sustainability," says Professor Katsonis, the project's coordinator. These kinds of smart materials are the norm in the natural world. Many of the architectures commonly found in nature are capable of adapting to their environment and fulfilling different functions. "Typically, if you take the leaf of a tree, it's self-cleaning and self-repairing. It's a material, but at the same time it's a solar cell," points out Katsonis. "Replicating these properties in man-made materials would radically change the relationship between mankind and its materials."


Material progress
This would represent an important step in the history of materials development. Humanity's technological progress has always been connected to the materials it makes and uses, from the stone age to the bronze age, and more recently the plastic age and the silicon age. Each new era has brought society to a new level of technical sophistication and material comfort, but the time-span over which materials are used is decreasing and priorities are changing. With the end of the silicon era, there is an urgent need for a more sustainable and efficient engineering paradigm in line with today's societal and environmental priorities. However, society has long valued immutability, so researchers have mainly developed persistent materials. "The materials we made in the past were typically made to resist in time, to resist changes," says Katsonis. The priorities in material development have shifted considerably since, and now scientists are working to incorporate energy efficiency, adaptability and sustainability into the overall engineering paradigm.

Researchers in the Phelix project aim to address this challenge by dissecting the operating principles of biological systems, determining how the molecules cooperate, and then re-engineering the overall system with the available technologies. The main approach used in Phelix for this re-engineering process is based on developing and using photo-responsive liquid crystals, which are doped with photo-controllable molecular motors and switches. "In Phelix we aim to develop smart materials by using design strategies inspired by nature. The main building block I am using is light-responsive liquid crystals. They respond very strongly to very small changes in the environment. So they're a very good starting point for smart materials," outlines Katsonis. "Specifically, I like to use photo-responsive liquid crystals, the organization of which is helix-based. Helices are everywhere in biological materials, from DNA to proteins. They also occur at larger scales, where they provide materials with structure, organization and function."

Chirality is a particular focus in the development of these design strategies and principles. A property of asymmetry which means an object cannot be superimposed on its mirror image, chirality plays a central role in many biological systems. "When you try to draw inspiration from biological construction principles you have to take chirality into account, because it is so important to biological systems," stresses Katsonis.



Fig. 1 (left): An azobenzene-based molecular switch is incorporated in the liquid crystal polymer network. Through bio-inspired materials design, its trans-to-cis isomerization is translated into motion at the macroscale. Fig. 2 (right): The nanostructured ribbons coil like springs in natural light. In UV light, trans-to-cis isomerization of the molecular switch is triggered. These molecular springs are able to lift objects and produce work (3 J/mol of azobenzene).

"Macroscopic natural structures like sea shells, open seed pods and plant tendrils are chiral and they have a 'handedness' – they can be either right-handed or left-handed." The generation of chiral shapes is an area of great interest to Katsonis and her colleagues. Many of the macroscopic structures found in nature are chiral, yet Katsonis says some important questions in this area remain unanswered. "It's not always obvious how nature generates its helices and chiral structures," she says. Studies have shown that in nature a wide variety of helical shapes can be generated by a single strip of tissue comprised of mutually opposing molecular layers. The Phelix project is now building further on these findings. "In plants, the helices are also smart. So, when we try to reproduce the architectures found in nature, we generate chiral, helical shapes and we also create smart materials," continues Katsonis. The scientific challenge here is complex. In order to make stimuli-responsive (smart) materials, researchers need to endow them with controllable, dynamic properties. "For that, we make use of molecules whose structure and properties are modified under external stimuli (light, temperature, electric field, magnetic field, etc.)," explains Katsonis. Currently chemists are able to make molecules that show a very specific movement in response to an external stimulus, but this movement is restricted to the scale of individual molecules. This is because these molecules are not able to work together, cooperatively, in the same way as they do in nature and in the human body.


“Chemists have developed molecular scissors, molecular rotors, molecular switches, molecular cars – a variety of molecules that display a specific movement. This work is tremendously inspiring. But this movement is often restricted to the nanoscale, because that’s the scale at which molecules work,” explains Katsonis.

Cucumber tendrils and the power of movement in plants Researchers are trying to understand how nature uses molecules, and also how it amplifies their motion in order to actuate plant tendrils. This area of research has a long history; Charles Darwin himself was fascinated by the Power of Movements in Plants. “We asked ourselves the question, how do cucumber tendrils wind and unwind?” continues Katsonis.



Left panel: Schematic representation of the approach taken in Task 1. The reactive monomers are shown in red, while the dopants are omitted for clarity. Irradiation with UV1 induces photoswitching of the dopants, with subsequent modification of the helix. Next, a polymer network is formed by irradiation with UV2. After irradiation is stopped, the polymer network retains a memory of the structures formed out of equilibrium. Right panel: Examples of liquid crystal monomers (2.1, 2.2) and photo-initiators (2.3, 2.4) available for this project.

The orientation of fibres within the plant is key to this question. Katsonis explains that if the plant is cut in two, leaving two semi-cylinders, the orientation of the fibres in one semi-cylinder is perpendicular to the orientation in the other. "In one of these cylinders the fibres that constitute the plant are oriented perpendicular to the other side, at 90 degrees," she outlines. This means that when humidity swells the fibres and the material expands, the expansion occurs along perpendicular directions on the two sides of the plant. When the expansion on one side of the plant occurs in the parallel plane, and on the other side in the perpendicular, that creates curvature, and this is what leads to twisting of the plant. "So by using differential deformation in different parts of the plant, you can create a twist," explains Katsonis. Researchers have used this principle in their liquid crystal system. "With these liquid crystals, we have created a system where the orientation of the molecules on top is perpendicular to the orientation of the molecules at the bottom," continues Katsonis. "So when we cut the ribbons and irradiate them with light, they coil, wind or unwind." This triggers motion which can be controlled by irradiation with light. The intensity of irradiation can be adjusted, while the molecular switches that are at the origin of the movement can be engineered chemically. "We can change the chemical structure of those switches, to make them switch faster or slower," says Katsonis.
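A back-of-the-envelope bilayer model captures the geometry at work; the numbers below are illustrative assumptions, not Phelix measurements. Light contracts the top layer while the perpendicular bottom layer expands, the strain mismatch over the thickness sets an overall curvature, and the angle at which the ribbon is cut relative to the molecular alignment partitions that curvature between pure bending (coiling into a ring) and twisting (winding into a helix):

import math

strain_top = -0.02       # 2% light-induced contraction of top layer (assumed)
strain_bottom = 0.02     # 2% expansion of the perpendicular layer (assumed)
thickness = 20e-6        # ribbon thickness in metres (assumed)

# Strain mismatch across the thickness sets the magnitude of curvature (1/m).
curvature = (strain_bottom - strain_top) / thickness

# The cut angle phi between the ribbon axis and the molecular alignment
# splits the deformation into bending and twisting components.
for phi_deg in (0.0, 22.5, 45.0):
    phi = math.radians(phi_deg)
    bend = curvature * math.cos(2 * phi)    # rolls the ribbon into a tube
    twist = curvature * math.sin(2 * phi)   # winds it into a helical spring
    print(f"cut at {phi_deg:4.1f} deg: bend {bend:7.1f} /m, twist {twist:7.1f} /m")

In this picture a ribbon cut along the alignment simply curls, one cut at 45 degrees twists into a spring, and intermediate angles give the mixed coiling and winding behaviour the ribbons display.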

"In plants, helix-based motion happens because the orientation of the molecules on top of the plant tissue is perpendicular to the orientation of the molecules at the other side of the plant. That's what we have mimicked."

The intensity of UV light and the structure of the molecular switches doping the liquid crystal polymer also have a major influence on the extent of motion. While these are both important factors, Katsonis says there are also other considerations. "The mechanical properties of the polymer are an important issue, because if it is very rigid it will not distort that well," she outlines.


In the biological world, helix-based actuators are found ubiquitously and perform many key roles in life, from mechanical integrity to energy storage. Inspired by nature, Katsonis and her group have designed artificial springs that twist or unwind under irradiation with light. These biomimetic springs also demonstrate a breakthrough strategy for translating nanoscale events to the macroscopic scale, and promise to teach us how to harness the movement of molecular motors, switches, ratchets and other molecular machinery to perform useful work.


Hybrid smart materials
One approach towards smart materials is to take inspiration from biological strategies and re-implement them using available technologies, but there are also alternatives. One complementary strategy in the development of bio-inspired and smart materials is to use the objects that mother nature, that supreme engineer, has developed over millions of years, either employing them as building blocks or re-engineering them into hybrid machines. "In Phelix, we are currently working on organizing plant viruses and protein cages in photo-responsive liquid crystals. In this case, photo-responsive liquid crystals act as dynamic templates – they promote the organization of the protein cages into fingerprint features, or into straight lines," outlines Katsonis. Protein cages can, in principle, encapsulate a wide variety of cargoes with special magnetic or optical properties.

When these functional units are organized in a certain way, new properties emerge. "We have shown that by organizing magnetic nanoparticles into long-range and dynamic assemblies, twisted liquid crystals can be used as efficient anisotropic templates for nanoparticles, and we have demonstrated the emergence of hybrid soft magnets at room temperature," says Katsonis. "Another option we are investigating is to attach artificial molecular motors and switches to proteins. Here, the major strategy to achieve efficiency will be to use biological processes to amplify molecular switching in bio-molecular functional architectures. The hybrid materials resulting from this could be injected into the human body, and their involvement in biological processes could be switched on and off by using light as a trigger. The consequences in terms of biotechnological applications would be huge, because it means that we would be able to intervene in such a large number of processes vital to the health of the human body."



At a glance

Full Project Title
Photo-engineering helices in chiral liquid crystals (Phelix)

Project Objectives
The aim of this project is to develop sophisticated helical materials with responsive architectures that are of interest in optical communication, energy management, photonic materials and mechanical actuation.

Superstructure of drops made out of cholesteric liquid crystals doped with photo-responsive molecular switches. Irradiation with UV light (365 nm) reveals hidden chiral information after 24 s.

Soft robotics and vision for the future
The potential applications of smart materials are as numerous as they are varied, reflecting the complexity of the natural world from which researchers have drawn inspiration. Biological systems must adapt to their environment; it's a question of survival. That's why, in nature, smart materials are the norm. "Think about the way a chameleon, a squid or an octopus adjusts its skin colour, or how flowers close in dry air to reduce transpiration, or the way the pupil in our eye shrinks or expands depending on the level of light," says Katsonis. Replicating these and other functions in smart materials could have a significant impact. "The idea is that smart materials are more efficient, more energy-friendly and more compact. These are the kinds of sustainable materials we will need in future," says Katsonis. The development of these materials is very much a long-term objective, and for the moment researchers within the Phelix project are working towards three specific functions. The first is actuators. "I'm developing photo-actuators – a material that receives light and transforms it into movement, complex movement," explains Katsonis.


These materials are again very versatile, and could be beneficial across several industry sectors. One potential application that Katsonis believes would greatly benefit wider society is in soft robotics. "Conventional robotics makes use of hard materials, so it can't be used to manipulate soft matter or living matter. In other words, our robots will have to go a long way before they become human-friendly, before they can improve prosthetic arms, or before they can help surgeons manipulate parts of the human body." This work with actuators will form an important part of the ongoing research agenda. In the next step Katsonis is looking to combine the systems that have already been developed with new technologies such as microfluidics, in order to build even more complex systems. "In nature, actuators are made of organized assemblies of unit compartments – typically the compartments I am thinking of are cells. So what I would like to do is to create universal moving units that reproduce the movements or deformations of plant cells, and organize them together to reach one level up in complexity," she says. "That's another area of research that I am looking forward to pursuing in the very near future."

Key Collaborators
• Dr Etienne Brasselet, Mechanical effects of light, CNRS Bordeaux, France
• Professor Ben L. Feringa, Molecular motors and switches, University of Groningen, NL
• Professor Steve Fletcher, Molecular machines, University of Oxford, UK
• Dr Séverine Le Gac, Microfluidics for nanomedicine, University of Twente, NL

Contact Details
Project Coordinator, Professor Nathalie Katsonis
Bio-inspired and Smart Materials
University of Twente (BNT)
P.O. Box 217, 7500 AE Enschede
The Netherlands
T: +31 53 489 26 29
E: n.h.katsonis@utwente.nl
W: www.katsonis.eu

Smart materials: Scientific Reports 2015, 5, 14183; Nature Communications 2015, 6, 7603; Nature 2006, 440, 163.
Bio-inspired materials: Nature Chemistry 2014, 6, 229; Org. Biomol. Chem. 2014, 12, 4065; Nature 2011, 479, 208.

Professor Nathalie Katsonis

Nathalie Katsonis gained her PhD in 2004 from Pierre and Marie Curie University (Paris VI) for work on nanoscale probing of 2D molecular self-assemblies. After a postdoctoral stay at the University of Groningen and a visiting researcher stay at KU Leuven, she accepted a position as Associate Researcher at the French National Centre for Scientific Research (CNRS). In 2009 she returned to Groningen as a junior research group leader, and she joined the University of Twente in 2011, where she is currently Associate Professor. Her research group is dedicated to designing and synthesizing bio-inspired and smart materials, with a special focus on soft robotics, chirality at all scales and molecular photoswitches.



Building trust in timber
While many architects are keen to make greater use of timber in construction, they're currently held back by its fire protection performance. Darren Atkins and Clive Atkins tell us how the Reactafire project's work in developing a unique advanced coating system will help improve structural fire protection for timber.

Many designers and architects are keen to make greater use of timber in buildings as they look for sustainable and cost-effective materials to use in construction. However, timber is a combustible material and so must be adequately protected against the risk of fire if it is to be more widely used, an issue which lies at the core of the Reactafire project. "We've set out to satisfy the needs of building designers by developing a coating system that gives them an hour of fire protection," says Clive Atkins, the project's coordinator and Managing Director at Fire Protection Coatings Ltd (FPCL). While architects are keen to utilise timber to a greater extent, they're currently held back by its fire protection performance. "At the moment architects swap to steel once a building goes above a certain height, because they don't trust timber. They would like to have the confidence to trust timber to go higher, but they're not confident in its fire protection performance," explains Clive Atkins. This is a centrally important issue in construction. It is estimated that between 2 and 2.5 million fires are reported in Europe each year, causing 20-25,000 deaths and costing around 0.2-0.3 percent of national GDP, reinforcing the importance of effective fire protection. "Our research shows that both Germany and the UK have a big problem with arson attacks. This is a concern, because if you lose a building in the construction stage it's very costly. It affects communities, and then there are also insurance problems," points out Darren Atkins, Group Sales Manager at FPCL. While timber offers many environmental benefits as a construction material, European regulatory authorities need to be reassured of its safety performance before it can be used more widely. "Before we applied for the grant we identified that some parts of Europe required a product that was capable of achieving one hour of fire protection for timber," says Clive Atkins.

Passive fire protection

One hour of fire protection provides sufficient time for a building to be safely evacuated in the event of a fire, and also for firefighters to be alerted to come and combat a blaze.


Passive fire protection methods, which contain a fire or slow its spread, have an important role to play in these terms. “The Reactafire system will delay the progression of a fire. That’s an important method of passive fire protection,” outlines Clive Atkins. Another key element of passive protection is creating fire barriers within a building, so that a fire can be contained within a single area for as long as possible. “Each storey within a building acts as a barrier to the fire. This kind of passive protection works hand-in-hand with active fire protection measures,” says Darren Atkins. “When you go into a building you’ll see lots of active fire protection measures, such as fire extinguishers. But passive fire protection is just as important. Passive fire protection can slow the progression of a fire and help contain it.”

The Reactafire project aims to develop a system that will improve structural fire resistance for at least one hour, allowing architects to trust a timber structure in the same way that they trust the structure of steel. Protection is currently provided in many timber-framed buildings through the use of non-combustible cladding. “Existing buildings with a timber frame are typically clad with a fireline board. That’s fine – that will give them a period of fire resistance,” outlines Clive Atkins. “The problem is that trades come along and add additional services and cut holes, creating penetrations. The minute you have a hole in the boarding system a fire can get through the hole, into the cavity, and a fire can spread very quickly. A cladding system works very well, as long as you can guarantee there are no gaps in it. There have been some major fires in the UK as a result of fire escaping from one compartment into another.”

Timber frame fire, Nottingham University, 12th September 2014.

Coating system

Researchers in the Reactafire project are using the latest advances in coatings technology to develop a unique advanced coating system which will enhance structural fire protection for timber. The chemicals used in the system must have some core properties. “They must have the ability to adhere to timber, and they must also have the ability to seal the surface, so that we don’t get water ingress. Then they have to delay the onset of char – in other words, create a thermal barrier. The central concept of the chemical formulation is to delay the process of timber and wood charring in a fire, which will help the structure last longer,” outlines Clive Atkins. This will be done by forming a durable wood char layer, which will provide sufficient insulation to prevent further char damage and increase the time available to evacuate the building. “The coating system is based on new materials used in fire protection,” says Clive Atkins. The system comprises several coating layers, including a base coat and a top seal, reflecting the overall complexity of the project’s work. While the primary objective is to develop a system that helps timber survive in a fire for sixty minutes, Darren Atkins says the project is also considering other issues. “We aim to meet various different objectives along with the wider goal of achieving one hour fire protection. This includes things like the toxicity of the system and its appearance,” he outlines.
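
The logic of delaying the onset of char can be illustrated with a rough calculation. The sketch below assumes the Eurocode 5 (EN 1995-1-2) one-dimensional design charring rate for solid softwood, about 0.65 mm per minute, and treats a coating purely as a delay on the onset of charring; the delay values are hypothetical placeholders, not Reactafire test results.

```python
# Rough illustration of why delaying the onset of char buys evacuation time.
# Assumes the Eurocode 5 (EN 1995-1-2) design charring rate for solid
# softwood, about 0.65 mm per minute; the onset-delay values below are
# hypothetical placeholders, not Reactafire test results.

CHARRING_RATE = 0.65  # mm of timber lost to char per minute of fire exposure

def char_depth(fire_minutes: float, onset_delay: float = 0.0) -> float:
    """Char depth in mm when a coating postpones charring by onset_delay."""
    return CHARRING_RATE * max(0.0, fire_minutes - onset_delay)

for delay in (0, 20, 40):  # minutes of charring delay provided by a coating
    print(f"{delay:2d} min delay -> {char_depth(60, delay):4.1f} mm char after 1 h")
# 0 min delay -> 39.0 mm; 20 min delay -> 26.0 mm; 40 min delay -> 13.0 mm
```

On this simple model, every minute of delayed char onset directly preserves structural cross-section, which is why the project focuses on the thermal barrier rather than on the timber itself.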

The main output of the project will be the base coat, but this will be combined with a top seal to make it ideal for external use. “Another important issue is having a primer and getting the correct adhesion to the substrate,” says Clive Atkins. “We’ve always thought of it as a holistic system and are experimenting with the various layers. We will be keeping the thickness to a minimum, because we have to consider the needs of the designers and the ultimate application process.”

The feedback from architects so far has been positive, with many keen to use the coating system and make greater use of timber in construction. Researchers are now moving towards the development stage of the project, testing the system to stringent standards and preparing further samples, with the objective of eventually bringing the system to the commercial marketplace. “Our aim is to develop a product that we can market throughout Europe, and eventually worldwide,” outlines Clive Atkins. Timber has long been neglected as a construction material, but with sustainability an increasingly important issue, Clive Atkins predicts that it will be more widely used in future. “The big paint manufacturers have targeted the steel market, which has meant that the timber industry has lagged behind. But timber is becoming a more fashionable product, and that demands that we devote time and effort to it,” he stresses.

At a glance
Full Project Title
Advanced System for Wood Fire Protection (Reactafire) – Project number: 604842
Project Objectives
Developing a unique fire protective system for structural timber and maximising its performance capability.
Project Partners
Fire Protection Coatings Ltd, UK • JW Ostendorf, Germany • Garvson AB, Sweden • Van Baerle AG, Switzerland • Energenics Europe Ltd, UK • PRA Trading Ltd, UK • SP Sveriges Tekniska Forskningsinstitut AB, Sweden • Ove Arup & Partners International Ltd, UK
Contact Details
Project Coordinator, Mr Clive Atkins
Fire Protection Coatings Limited
Unit 1-3 Highway Point
239 Torrington Avenue
Coventry, CV4 9AP
T: +44 (0) 24 7642 2200
E: fpcltd@btconnect.com
W: www.reactafire.eu
W: www.fireprotectioncoatings.com

Mr Clive Atkins

Mr Clive Atkins (Project Coordinator) has over 30 years’ experience in the fire protection industry. Clive is a former Chairman of the Association of Specialist Fire Protection (ASFP) and is still a council member. He has been heavily involved in the Fire Safety Development Group (FSDG). Clive has also been Chairman of the Fire Protection Association (FPA) for Coventry and Solihull, and was formerly a director at Nullifire Limited, the world-renowned manufacturer of intumescent materials. Clive has also been heavily involved in committee work developing industry guidance, including the ASFP Orange Book, which relates to the classification of fire retardant coating systems in terms of their reaction-to-fire performance.

The Reactafire Team

This project has received funding from the European Union’s Seventh Framework Programme for research, technological development and demonstration (Grant Agreement No: 604842).



Unlocking the secrets of self-control

There is a clear disciplinary divide in the study of self-control, with philosophers emphasising willpower, while economists look towards external mechanisms by which an individual can commit their future self. The TeamControl project aims to bring elements of several disciplines together to develop a novel account of how we achieve self-control, as Dr Natalie Gold explains.

Exerting self-control is often crucial to the pursuit of personal objectives, whether it means dieting in an effort to lose weight, or resisting the urge to go for a drink in favour of studying for an exam. Based at King’s College London, Dr Natalie Gold is the coordinator of the TeamControl project, an initiative which aims to develop and test a novel account of how we achieve self-control. “The main issue in the project is the development of a hypothesis about what it is to be a person, and how that can help you achieve self-control,” she outlines. The project combines approaches from philosophy, psychology and economics to develop an account of self-control that’s consistent with the current psychological literature. “We’re interested in adults in specific situations, where some might be able to exert self-control and some might not. Then the question is: why are some people more successful than others?” asks Dr Gold.

Disciplinary divide

There is a clear disciplinary divide in this area of research. Philosophers place great emphasis on willpower in the study of self-control, while the focus in economics and rational choice theory has been on external mechanisms by which an individual can commit their future self. “Rational choice theory struggles to explain how people can avoid temptation,” says Dr Gold. It also struggles to explain why people cooperate in certain situations, such as in the Prisoner’s Dilemma, where the length of sentence two criminal accomplices will get depends on whether they choose to betray each other; Dr Gold says parallels can be drawn between this situation and an individual trying to exert self-control. “In part of the project we draw parallels between how rational choice theory treats persons as members of groups, and persons over time,” she explains. “It turns out that the way the person is modelled over time is analogous to group behaviour in situations like the Prisoner’s Dilemma.”

A good example can be provided by returning to the dieting analogy, where an individual aims to commit to a long-term dietary plan. There are models in rational choice theory where a person is split into different time-slices representing what the person wants at each moment in time. “So, before dinner, the agent thinks: ‘well, I’m on a diet and I want to lose weight, so I shouldn’t eat dessert’,” explains Dr Gold. However, the future time-slice may not be willing to forgo dessert, and in rational choice theory its desire to enjoy a piece of cake trumps the longer-term considerations. “Rational choice theory says it’s obvious why we give in to temptation – because people are being asked to bear a cost now that can be redeemed at some point in the future, and why would they do that? So of course, left to themselves they won’t do it,” continues Dr Gold. “The idea of my project is to think about a new way in which we could model willpower and intentions.”

This research involves taking the apparatus of team reasoning, which has already been used with respect to behaviour in groups, where a team of individuals decide on a plan and then each follow their part of it. Instead of individuals in a group, however, researchers now look at time-slices representing the person as a team over time. “So an individual has a plan and it tells them what all the time-slices should do,” says Dr Gold. This provides a formulation by which willpower and intentions can be modelled. “Take a person before dinner following a diet – if they’re taking that team perspective, and thinking of the person as a team over time, then they can formulate a plan. Then, providing the person at dinner also sees it from the perspective of the team over time, and not from their own time-slice perspective, they can follow the plan – there’s a rationale to it,” explains Dr Gold. “That’s willpower basically. Forming a plan is an intention, and carrying it out is willpower.”
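
The contrast between the two perspectives can be made concrete with a toy calculation, in which all payoff numbers are invented for illustration rather than drawn from the TeamControl project: a time-slice weighing only its own discounted utility eats the dessert, while a slice that evaluates the plan from the standpoint of the whole team of slices follows the diet.

```python
# Toy model of the "person as a team over time" idea. All payoff values
# are invented for illustration; they are not TeamControl project figures.

DESSERT_PLEASURE = 5.0   # immediate utility of eating dessert tonight
DIET_BENEFIT = 8.0       # later utility of sticking to the diet
DISCOUNT = 0.5           # how heavily the current slice discounts the future

def myopic_slice_choice() -> str:
    """Each time-slice maximises only its own discounted utility."""
    skip_value = DISCOUNT * DIET_BENEFIT  # future benefit counts for less now
    return "eat dessert" if DESSERT_PLEASURE > skip_value else "skip dessert"

def team_over_time_choice() -> str:
    """The slice asks which action is best for the plan of the whole
    'team' of time-slices, weighing every slice's payoff equally."""
    return "eat dessert" if DESSERT_PLEASURE > DIET_BENEFIT else "skip dessert"

print("myopic time-slice:", myopic_slice_choice())   # eat dessert
print("team over time:  ", team_over_time_choice())  # skip dessert
```

The design choice mirrors the text: nothing about the payoffs changes between the two functions, only the level of agency from which they are evaluated.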

Self over time

A key question here is how an individual perceives their own self: whether they emphasise their immediate interests, or their self over time. Evidence suggests that people who feel more connected to their future selves tend to have a lower discount rate – the rate at which they discount the future benefits of a particular course of action, essentially a measure of impatience. “It effectively means they have an attitude which helps them to avoid temptation,” explains Dr Gold. Researchers are also bringing together elements of other disciplines to develop and test a novel account of how we achieve self-control. “There’s a framework from economics, there’s empirical results from psychology, and then there are also concepts from philosophy about the rationality of acting on an intention,” continues Dr Gold. “The rationality of acting on intentions is problematic in philosophy, because if forming an intention gives you a reason to do something – just because you intend to do it – then it implies that you can give yourself a reason for doing anything, just by intending to do it.” The framework of the project provides a way around this problem, which is known in philosophy as ‘bootstrapping’. By conceiving of the individual as a team over time, researchers have developed a framework in which self-control can be seen as a rational course of action. “At a particular level of agency, namely the person as a team over time, avoiding temptation and acting on intentions can be what’s best for the agent’s plans,” explains Dr Gold. One way of exerting self-control can be to develop a longer-term plan, a framework within which to resist temptation. “So if an individual thinks the best way to resist temptation is to think of the self as a self over time, extended to consider future consequences, then maybe they can prepare themselves to resist a temptation when it is put in front of them,” says Dr Gold.
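
The role of the discount rate can be shown with a small illustrative calculation, again using invented utilities: the lower the rate at which future benefits are discounted, the more a distant goal is worth today relative to an immediate temptation.

```python
# Illustrative only: invented utilities showing how the discount rate
# (a measure of impatience) decides whether a future goal outweighs an
# immediate temptation.

def present_value(benefit: float, delay: int, rate: float) -> float:
    """Exponentially discounted value today of a benefit arriving later."""
    return benefit / (1 + rate) ** delay

TEMPTATION_NOW = 5.0   # utility of giving in today
FUTURE_BENEFIT = 8.0   # utility of the long-term goal, 10 periods away

for rate in (0.01, 0.05, 0.10):
    value_now = present_value(FUTURE_BENEFIT, delay=10, rate=rate)
    verdict = "resists temptation" if value_now > TEMPTATION_NOW else "gives in"
    print(f"discount rate {rate:.2f}: goal worth {value_now:.2f} today -> {verdict}")
# 0.01 -> 7.24 (resists); 0.05 -> 4.91 (gives in); 0.10 -> 3.08 (gives in)
```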


Borderline personality disorder

This research holds significant implications for borderline personality disorder, a mental health disorder often characterised by impulsive or impatient behaviour. A set of nine core symptoms have been identified, and people need to display at least five to be diagnosed. “One of the core symptoms is lack of a sense of self, which is very common amongst people with borderline personality disorder. Another one is self-control problems; a lot of people with borderline personality disorder have drug addictions, or they struggle to hold down jobs,” outlines Dr Gold. These symptoms often co-occur, says Dr Gold. “One of the suggestions is that we can use this framework that’s been developed in the TeamControl project to impose some causality on what’s going on in cases of borderline personality disorder. It has been suggested that therapies around the self might have an impact on those self-control problems,” she continues. There are many ways in which people with borderline personality disorder lack a sense of self, or have an unstable self-image, an area of great interest to Dr Gold. An individual with borderline personality disorder might act differently around different people, or have a sense of not being the same person over time, which Dr Gold says is distinct from more general feelings of emptiness and lethargy. “It looks like these different manifestations of lack of self have different mechanisms. There are five different aspects of borderline and the self over time, and they might not all have quite the same connection with resisting temptation,” she outlines. Dr Gold plans to pursue further research in this area in future. “Research over the next year or so will drill down on that a bit more and present a more nuanced picture,” she says.

At a glance
Full Project Title
Self-Control and the Person: An Inter-Disciplinary Account (TeamControl)
Project Objectives
The goal of TeamControl is to develop and test an account of self-control, based on the novel concept of a person as a “team over time”. The project bridges philosophy, psychology, and economics, using methodologies from all three.
Project Funding
€1,033,003
Contact Details
Dr Natalie Gold
Senior Research Fellow
Philosophy Department
King’s College London
Strand, London WC2R 2LS
T: +44 (0)20 7848 2594
E: natalie.gold@kcl.ac.uk
W: http://selfcontrolandtheperson.com/

Dr Natalie Gold

Dr Natalie Gold is a Senior Research Fellow at King’s College London. Prior to that she was a lecturer at the University of Edinburgh. Her research is on behavioural decision-making and moral psychology. She has worked on topics including framing, moral judgements and decisions, cooperation and coordination, and self-control.



Supporting research with Śāstravid

Effective online tools can enable students to rapidly find information on concepts and ideas central to their studies. The Śāstravid project is developing a new web-based research tool under the direction of Dr Jan Westerhoff, which will help students of Indian philosophy to explore both the explicit connections between texts and their conceptual connections, as Dr David Gold explains.

The study of Indian philosophy has long involved reading key texts and analysing the thoughts and ideas of a wide variety of thinkers. Now tools are emerging that will help researchers gain new insights into the subject. While the library will of course always be an important resource for students, the Sastravid project’s web-based electronic research tool will bring significant further benefits. “The Sastravid system aids research by revealing connections among and between philosophical texts and concepts. It’s intended to make it easy for researchers to follow those connections in ways that would advance their research goals,” says Dr David Gold, Chief Executive of Bridgeton Research, the company that developed the software. The Sastravid project has been designed by Dr Jan Westerhoff of Oxford University with funding from the European Research Council. During the last four years Dr Jan Westerhoff and his team of researchers have been developing the Sastravid resource with the assistance of Bridgeton Research. The starting point was a deep understanding of the subject matter; Indian philosophy is a diverse subject, and is identified more by the location from which its central ideas were developed than by any conceptual unity. “The term ‘Indian philosophy’ refers to a set of schools of philosophy that arose in India during pre-modern times,” outlines Dr Gold. “For the most part they share certain features that differ from most schools of western philosophy.” The project’s focus so far has been on an area of Buddhist philosophy called Madhyamaka, and particularly on the work of Nagarjuna, an influential philosopher from around the 1st–3rd centuries. While it is not entirely clear how many texts he authored, Nagarjuna is widely considered to be one of the most important philosophers in the Buddhist tradition, particularly in the Mahayana schools, contributing central ideas related to the concept of sunyata, or emptiness. “A small group of scholars have contributed all the content that is currently in Sastravid, and it is mostly based on the work of Nagarjuna, and commentaries and translations rooted in his work,” explains Dr Gold. The material that has been gathered is represented within Sastravid according to two key structures. “We have the textual side and the conceptual side. The two components each have their own structure that is appropriate to two different types of material. And the two components are connected to each other in a specific way that reflects their relationship,” says Dr Gold.

Research tool

The approach on the textual side reflects the nature of the commentarial tradition in Indian philosophy. Successive generations of philosophers and thinkers have contributed their own commentaries on root texts, generating a large, complex body of literature. “For example, an ancient Buddhist author wrote a philosophical text, which became important in a certain tradition of Buddhism, then other philosophers wrote commentaries on it. Let’s say the original text was written in Sanskrit, then somebody translated it into Tibetan, and then somebody else wrote a commentary on it in Sanskrit, and someone else wrote a commentary on that commentary, or on the Tibetan translation,” outlines Dr Gold. “Modern scholars may also have translated the root or its commentaries into modern languages.” Eventually this adds up to a dense network of commentaries and translations, in several different languages, which the project aims to represent within its research tool. “Every one of these commentaries and translations relates to another one in this hierarchy, going all the way back to this one root text,” says Dr Gold. “So we think of all of those as one big hierarchy – one for any number of root texts in the Indian philosophical tradition.” The key here is that these are truly running commentaries within the Sastravid system, with a given section of the commentary linked to the relevant section of text, meaning a researcher using the system can quickly identify a passage of interest, and then easily see its commentaries, or the passage of another text that it may be a commentary on. The material itself has been put into the system by Sastravid Principal Investigator Dr Jan Westerhoff and his team of researchers at the University of Oxford. “All the material is put in by people who contribute to the system. On the textual side, it’s largely primary sources in Indian philosophy, together with any notes created by contributors,” outlines Dr Gold.

The system is built in a way that allows multiple people to contribute to it at once, while new commentaries and translations from modern scholars can also be added. “From a software point of view, the technical side of the project has primarily been about creating the tools and the environment for actually producing the content and structuring it properly, so as to make it as useful as possible to researchers. Any number of people can work simultaneously producing content from anywhere,” says Dr Gold.

A typical researcher studying Indian philosophy will be interested not just in the content of the root text and the associated commentaries, though, but also in the ideas that are expressed in them. The unique structure of the Sastravid tool represents not only the explicit connections between texts, but also their conceptual connections, so that researchers can rapidly identify relationships between related ideas, even when they are expressed in different texts. “The other major component of Sastravid is a hierarchy of what we call concepts, which are independent of the texts. Concepts are expressed in the texts, but the same idea may be expressed in many different texts,” explains Dr Gold. Researchers can browse or search this hierarchy to find particular ideas of interest, and then use those results to find the particular texts that express those ideas. “When you’re in the texts in the system, you can then find your way out into the concepts, and from there you can find other texts within the system that express the same or related ideas,” says Dr Gold.
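
The two linked structures Dr Gold describes can be pictured with a minimal data model. This is an illustrative sketch only: the class and field names are guesses for exposition, not the actual Sastravid schema.

```python
# A minimal sketch of the two linked hierarchies described above. The class
# and field names are illustrative guesses, not the actual Sastravid schema.

from dataclasses import dataclass, field

@dataclass
class Concept:
    name: str
    broader: "Concept | None" = None   # parent in the concept hierarchy

@dataclass
class Passage:
    section: str                       # e.g. a chapter and verse reference
    content: str
    concepts: list[Concept] = field(default_factory=list)

@dataclass
class Text:
    title: str
    language: str
    parent: "Text | None" = None       # root text or commentary it targets
    passages: list[Passage] = field(default_factory=list)

# A root text, a commentary on it, and a concept both may express.
root = Text("Mulamadhyamakakarika", "Sanskrit")
commentary = Text("Prasannapada", "Sanskrit", parent=root)
emptiness = Concept("sunyata (emptiness)")
root.passages.append(Passage("24.18", "...", concepts=[emptiness]))

def passages_expressing(concept: Concept, texts: list[Text]) -> list[tuple[str, str]]:
    """Route from a concept back out to every passage that expresses it."""
    return [(t.title, p.section) for t in texts for p in t.passages
            if concept in p.concepts]

print(passages_expressing(emptiness, [root, commentary]))
# [('Mulamadhyamakakarika', '24.18')]
```

The key point of the design is that the concept hierarchy lives outside any single text, so the same `Concept` can be attached to passages in a root text, its commentaries, and its translations alike.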


Screenshot of the Sastravid.net website.





At a glance
Full Project Title
Sastravid – a new paradigm for the study of Indian philosophical texts
Project Objectives
The aim of the project is to transform the way Indian philosophical texts are currently studied. We provide a philosophical analysis of a set of central works from the Indian tradition, well known for its demanding content and the conceptual complexity of its arguments. This analysis will incorporate a set of cutting-edge methodological principles.
Project Funding
ERC-funded project
Contact Details
Project Coordinator, Dr Jan C Westerhoff
Lady Margaret Hall
University of Oxford
Norham Gardens
Oxford OX2 6QA
United Kingdom
T: +44 1865 274261
E: jan.westerhoff@lmh.ox.ac.uk
W: www.sastravid.net

Dr Jan Westerhoff

Originally trained as a philosopher and orientalist, Dr Jan Westerhoff focuses his research on philosophical aspects of the religious traditions of ancient India. Much of his work concentrates on Buddhist thought (especially Madhyamaka) as preserved in Sanskrit and Tibetan sources; he also has a lively interest in Classical Indian philosophy (particularly Nyaya). His research on Buddhist philosophy covers both theoretical (metaphysics, epistemology, philosophy of language) and normative aspects (ethics); he is also interested in the investigation of Buddhist meditative practice from the perspective of cognitive science and the philosophy of mind.


System permissions

This allows researchers to rapidly identify key texts and concepts related to their specific area of interest, from which they can then analyse the material and develop their own ideas. Principal Investigator Dr Jan Westerhoff and his team of researchers at Oxford University control each individual’s level of access to the system. “Any number of people can work simultaneously producing content from anywhere, and there are different levels of permission. Jan and his colleagues can decide what level of freedom to grant the person who is producing content,” explains Dr Gold. People can work either individually or in groups within the Sastravid system, while they can also invite each other to work on specific projects, which can be modified on an ongoing basis. “People can edit projects to whatever degree they want before publishing them, and continue to make changes to them once they’ve been published. People can edit each other’s work and move concepts around, even change the concepts – depending on the level of permission they have.”

The system is intended primarily for professional academic researchers, but it could be used by anyone with a basic knowledge of Indian philosophy who wants to take a deeper interest in the subject. The system is not primarily about students and the actual teaching process, but rather about helping advanced students doing original research. “From a student’s perspective, I would think Sastravid is more for people writing a paper rather than for a student who needs a basic understanding of a subject,” says Dr Gold. However, Dr Gold believes the system could be an important resource in collaborative projects, and encourage cooperation between researchers. “The software supports working in a group – so a group of students could work together under a professor’s supervision, and they could create content that’s either visible only to members of the group or made available more widely on the system,” continues Dr Gold. “Everything can be published at the press of a button by somebody with the necessary permissions.”

This could in the future potentially encompass key texts and commentaries from other areas of Indian philosophy, aside from Madhyamaka. While the project has so far been mainly centred on Madhyamaka, Dr Gold says there are no limitations within the Sastravid software that prevent the inclusion of material from other schools of thought. “You could certainly go on representing other areas of Indian philosophy without making any changes to the software. The system is designed to handle any level of detail and nuance,” he stresses. “The idea of the Sastravid system is basically to allow researchers to represent an area of thought in a way that makes sense from the point of view of the discipline that’s being represented,” explains Dr Gold. “It guides researchers, helping them to choose a way that states it as clearly as possible.”

The use of technology in education is an area of intense debate, with professionals keen to make sure information is accessible. For his part, Dr Gold believes that the key issue is to make sure that learning goals are always the primary consideration when using technology in education. “The right tools need to be chosen to support educational goals, as opposed to educational practices bending in order to accommodate the latest technologies. If it’s done correctly, the availability of more and more online tools is a positive development,” he says. Development in this area is ongoing; Dr Gold and his company are working on software for organising ideas in any subject area, with a hierarchy of concepts that can be applied to material of all kinds, anywhere on the web. “For example, research published on a WordPress site, or many sites, could be connected to a centralised hierarchy of ideas that could be overseen by a professor. In that way it builds connections between different sites and publications addressing related ideas, even if the content-creators aren’t working together,” explains Dr Gold.
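
The workflow described above, tiered permission levels combined with group-only drafts that can be published system-wide at the press of a button, could be modelled along the following lines. The level names and checks are assumptions for illustration; the article does not specify Sastravid’s actual permission scheme.

```python
# Hypothetical sketch of a tiered permission model with group-only drafts.
# Level names and rules are illustrative assumptions, not the real
# Sastravid design.

from enum import IntEnum

class Permission(IntEnum):
    VIEWER = 1       # may read published content
    CONTRIBUTOR = 2  # may create and edit their own project content
    EDITOR = 3       # may edit others' work and move concepts around
    PUBLISHER = 4    # may publish a project for the whole system

def can_edit_others(level: Permission) -> bool:
    return level >= Permission.EDITOR

def is_visible(published: bool, viewer_in_group: bool) -> bool:
    """Group drafts stay members-only until an authorised user publishes."""
    return published or viewer_in_group

# A contributor's draft is hidden from outsiders until it is published.
assert not is_visible(published=False, viewer_in_group=False)
assert is_visible(published=True, viewer_in_group=False)
assert not can_edit_others(Permission.CONTRIBUTOR)
```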


