EU Research WINTER 2018/19


BREXIT: The Final Countdown

Cure for AIDS could be within reach?

ERC announces €573 million to research projects

12-year ultimatum to prevent climate catastrophe

Latest on InSight probe to Mars

Disseminating the latest research under FP7 and Horizon 2020 Follow EU Research on www.twitter.com/EU_RESEARCH



Editor’s Note

In this issue there are many incredible innovations and much original research that have come about through collaboration between scientists from different EU Member States, made possible by funding from EU initiatives like Horizon 2020.

It’s therefore going to be a strange departure for the UK when it disengages from what has traditionally been such a close and free partnership, working together towards shared goals. The hope is that the future of science will still include, if not shared funding, then at least shared expertise. At the time of writing I am awaiting specific answers to requests put to UK Ministers and the European Commission on the future of UK science funding. So far, it’s all gone a bit quiet. The only thing I know for certain is that any funding already secured is safe; as for the future, I wonder if anyone really knows what’s going on at all. My personal feeling is that there is simply too much at stake for our future generations for nationalism to become a barrier to solving problems vital to our survival and progress. Beyond the politics of the moment, but relevant to politics nonetheless, are issues that should keep everyone awake at night, such as safeguarding our food security, responding to climate impacts and reversing desertification.

As a seasoned editor and journalist, Richard Forsyth has been reporting on numerous aspects of European scientific research for over 10 years. He has written for many titles including ERCIM’s publication, CSP Today, Sustainable Development magazine and eStrategies magazine, and remains a prolific contributor to the UK business press. He also works in Public Relations for businesses, helping them communicate their services effectively to industry and consumers.

We have touched on this issue before in recent editions of EU Research, but recently the New World Atlas of Desertification was created by EU scientists to illustrate the reality of the challenges we now face from infertile landscapes. A fact I find alarming is that 90% of available land could become too degraded for farming food by 2050 – only 32 years from now. Our children could witness a world where food resources are seen in a very different light. When vital resources become scarce, control and management of those resources become the priority, and that can lead to conflict and division. The practicalities involved in restoring dead landscapes will take a concerted effort with resources, considerable patience, willpower and innovation. The one thing we do know for certain is that no outcome can be guaranteed, and it is hard to feel comfortable with the idea that we’ll turn it all around through some unity by design, especially as the current socio-economic panorama is looking so fragmented. Let’s hope that at some juncture our shared impending dilemmas will be fully understood by both people and politicians. In the meantime, the EU is doing what it can to address these issues, so we can hope that success is the outcome.

Hope you enjoy the issue.

Richard Forsyth Editor



Contents

4 Research News EU Research takes a look at the latest news in scientific research, highlighting new discoveries, major breakthroughs and new areas of investigation

10 CellFusion Cell fusion is integral to sexual reproduction and the ongoing development of an organism, yet there are still significant gaps in our understanding of the process. We spoke to Professor Sophie Martin about her work in this area.

12 TransMID Mathematics is an important tool to describe the spread of infectious diseases. Researchers in the TransMID project are developing ever-more sophisticated tools and software to improve the effectiveness of modelling, as Professor Niel Hens explains.

14 Environmental and Immunological Tolerance Allergic diseases affect around one in four children in the West, representing a significant public health problem. Professor Bianca Schaub and her colleagues are probing deeper into the underlying factors behind the development of allergic diseases.

15 New Peptides for Food Very small molecules can be used to produce functional materials with novel properties, yet it remains difficult to predict how these molecules interact with each other. We spoke to Dr Pim Frederix about his work in developing a computational framework.


16 Experimental Psychology The visual system seems to give us a continuous view of the world, yet the eyes actually shift position three times a second. This raises some interesting questions around how we achieve perceptual continuity, a topic central to Professor Martin Rolfs’ research.

18 Comfort for the Troubled Mind Many of us react to stress by indulging in some fatty, sugary food, a reaction which has deep historical roots. We spoke to Dr Frank Meye about his research into stress-eating, which could hold important implications for our understanding of several eating disorders.

20 Mechanochemical and Supramolecular Self-healing materials are commonly found in nature, yet man-made materials do not typically have the ability to repair themselves after damage and regenerate their function. We spoke to Professor Wolfgang H. Binder about his research into self-healing and stress reporting.

23 NitroPortugal Nitrogen is essential to life, yet its reactive forms can also have harmful effects on the environment when present in excessive amounts. The NitroPortugal project aims to strengthen the country’s research base in this area, as Professor Cláudia Marques-dos-Santos Cordovil explains.

28 Illiquidity, Insolvency and Bank Regulation

Debate continues about how the financial industry can be regulated effectively. We spoke to Professor Gerhard Illing about his research into the feedback between monetary policy and risk-taking in the financial industry.

30 Monetary and Fiscal Policy The 2008 financial crisis posed a major challenge to governments and central banks across the world, as major institutions teetered on the verge of collapse. We spoke to Professor Andreas Schabert about his work in assessing the effectiveness of the policy instruments deployed to deal with the crisis.

31 ECORD

Rocks and sediments from below the ocean floor hold enormous scientific importance, helping researchers learn more about the history of the earth. The ECORD consortium supports ocean drilling, from which important scientific insights can be drawn, as Dr Gilbert Camoin explains.

34 Desertification: A Threat to Land and Life Desertification is a major concern across the world, as more and more land degrades. We take a closer look at the extent of the problem and its implications for the future.

By Richard Forsyth

38 Tailor Made Fuels Biofuels offer an attractive alternative to conventional fuels, so are widely recognised as a research priority. The Fuel Science Center has been established to work towards the development of biohybrid fuels, as Dipl.-Ing. Bastian Lehrheuer explains.

40 Vertebrate Herbivory The first land-based vertebrates are thought to have fed mainly on animal matter before they later began feeding on plants and other resources. Professor Thomas Tütken is using dietary proxies to investigate when, and in what species, this change occurred.

42 Design of 2D Materials A deeper understanding of complex particles and the way they interact with each other could open up the possibility of researchers using them as building blocks in the design and development of new materials, as Dr Laura Rossi explains.


44 BUILDUP The universe is thought to be around 13.8 billion years old, and researchers continue to probe ever deeper into its history and evolution. Researchers in the BUILDUP project aim to reconstruct galaxy assembly during the early history of the universe, as Professor Karina Caputi explains.

46 FLEXOP Researchers in the Flexop project are developing new tools to both model flutter and to control it during flight, which could open up new possibilities in design and boost the competitiveness of European industry, as Dr Balint Vanek explains.

48 Projestor The typical Artificial Intelligence (AI) system is currently run on computers which consume kilowatts or even megawatts of power. We spoke to Dr Abu Sebastian about the Projestor project’s work in developing a new memory device concept which could open up new possibilities in AI.

50 SPPEXA The age of exascale computing is coming, and dramatic increases in computational speed could lead to important scientific breakthroughs. Software must also evolve in line with these changes, a topic at the heart of Dr Benjamin Uekerman’s work in the SPPEXA project.

52 Less Confusion from Dynamic Information Dynamic visualisations can provide valuable insights into time-varying data like stock prices and traffic information. Dr Kevin Verbeek tells us about his work in analysing the stability of the geometric algorithms behind these visualisations.

54 BREXIT and Science Brexit dominates the political agenda in the UK, but what does it mean for science? And how will it affect the relationship between UK scientists and their colleagues in the EU? We take a closer look at the issue. By Richard Forsyth.


58 LoC Imagination allows us to conceive of scenarios beyond our own experience, yet it’s also important for everyday practical reasoning. Researchers in the Logic of Conceivability project are using mathematical tools to investigate the logic of the human imagination, as Dr Peter Hawke explains.

60 Memoria Apostolorum Memories of Jesus of Nazareth, as well as reflections on his life and work, are central to the Christian faith, so references to him and apostolic figures abound in the literature. We spoke to Dr Stephan Witetschek about his work in tracing ways of remembering these apostolic figures.

62 Beyond the Elite The history of the Jewish people in Europe dates back to the early Middle Ages. Researchers in the Beyond the Elite project aim to build a clearer picture of daily life in the period between 1100 and 1350, as Professor Elisheva Baumgarten explains.

65 GrDyAp Groups are of fundamental importance throughout mathematics. Professor Andreas Thom tells us about the work of the GrDyAp project in connecting group theory, functional analysis, and the theory of dynamical systems.

68 MiLifeStatus Debate continues over whether citizenship should be viewed as an incentive for migrants to integrate in local communities, or as a reward for doing so. The MiLifeStatus project is probing deeper into the relationship between migrant naturalisation and integration, as Professor Maarten Vink explains.

70 DOS Domestic service was an important category of labour in early modern and colonial India, yet it has been relatively neglected in research. Now Dr Nitin Sinha, Dr Nitin Varma, Sourav Kumar Mahanta and Vidhya Raveendranathan are looking more deeply into this area.


EDITORIAL
Managing Editor Richard Forsyth info@euresearcher.com
Deputy Editor Patrick Truss patrick@euresearcher.com
Deputy Editor Richard Davey rich@euresearcher.com
Science Writer Holly Cave www.hollycave.co.uk
Acquisitions Editor Elizabeth Sparks info@euresearcher.com

PRODUCTION
Production Manager Jenny O’Neill jenny@euresearcher.com
Production Assistant Tim Smith info@euresearcher.com
Art Director Daniel Hall design@euresearcher.com
Design Manager David Patten design@euresearcher.com
Illustrator Martin Carr mary@twocatsintheyard.co.uk

PUBLISHING
Managing Director Edward Taberner etaberner@euresearcher.com
Scientific Director Dr Peter Taberner info@euresearcher.com
Office Manager Janis Beazley info@euresearcher.com
Finance Manager Adrian Hawthorne info@euresearcher.com
Account Manager Jane Tareen jane@euresearcher.com

EU Research
Blazon Publishing and Media Ltd
131 Lydney Road, Bristol, BS10 5JR, United Kingdom
T: +44 (0)207 193 9820 F: +44 (0)117 9244 022
E: info@euresearcher.com
www.euresearcher.com
© Blazon Publishing June 2010

Cert no. TT-COC-2200



RESEARCH NEWS

The EU Research team take a look at current events in the scientific news

UN ultimatum to save the world from climate catastrophe

‘Unprecedented’ United Nations report says that humans have 12 years to stop climate change becoming irreversible.

We have just 12 years to make massive and unprecedented changes to global energy infrastructure to limit global warming to moderate levels, the United Nations’ climate science body said in a monumental new report released Sunday. “There is no documented historic precedent” for the action needed at this moment, the Intergovernmental Panel on Climate Change (IPCC) wrote in its 700-page report on the impacts of global warming of 2.7 degrees Fahrenheit, or 1.5 degrees Celsius.

From rising sea levels to more devastating droughts to more damaging storms, the report makes brutally clear that warming will make the world worse for us in the forms of famine, disease, economic tolls, and refugee crises. And there is a vast gulf between the devastation from 1.5°C, what’s considered the moderate level of average warming, and 2°C. “It’s very clear that half a degree matters,” said Valérie Masson-Delmotte, co-chair of IPCC Working Group I, at a press conference in Incheon, South Korea, where the report was released.

Under the Paris climate agreement, nations set a goal of limiting the increase in global average temperatures to 3.6°F, or 2°C, with ambitions of a stricter limit of 2.7°F, or 1.5°C, of warming. The UN asked the IPCC to figure out what it would take to hit the 1.5°C target, and what’s in store for the world if we pull it off. The team drew on more than 6,000 scientific publications, contributions from 133 authors, and reviews of the findings by more than 1,000 scientists.

As expected, the report doesn’t pull any punches: staying at or below 1.5°C requires slashing global greenhouse gas emissions 45 percent below 2010 levels by 2030 and reaching net zero by 2050. Meeting this goal demands extraordinary transitions in transportation; in energy, land, and building infrastructure; and in industrial systems. It means reducing our current coal consumption by one-third. It also demands a vast scale-up of emerging technologies, such as those that remove carbon dioxide directly from the air. All in the very narrow window of the next 12 years, while our momentum pushes us in the wrong direction.

The scientists added that we are already seeing the impact of climate change in the form of extreme weather, rising sea levels, and diminishing Arctic sea ice, among other effects. They said some of the changes needed to slow the temperature rise are already being implemented, ‘but they would need to accelerate’. They see the report being part of these changes. “This report gives policymakers and practitioners the information they need to make decisions that tackle climate change while considering local context and people’s needs,” said Debra Roberts, Co-Chair of IPCC Working Group II. “The next few years are probably the most important in our history.”

How long have we got? Not long at all. But that issue is now in the hands of political leaders. The report says hard decisions can no longer be kicked down the road. If the nations of the world don’t act soon, they will have to rely even more on unproven technologies to take carbon out of the air – an expensive and uncertain road. “They really need to start work immediately. The report is clear that if governments just fulfil the pledges they made in the Paris agreement for 2030, it is not good enough. It will make it very difficult to consider global warming of 1.5C,” said Prof Jim Skea. “If they read the report and decide to increase their ambitions and act more immediately, then 1.5C stays within reach – that’s the nature of the choice they face.”

Campaigners and environmentalists, who have welcomed the report, say there is simply no time left for debate. “This is the moment where we need to decide,” said Kaisa Kosonen. “We want to move to clean energy, sustainable lifestyles. We want to protect our forests and species. This is the moment that we will remember; this is the year when the turning point happened.”



Sol 5: Instrument Context Camera (ICC). NASA’s InSight Mars lander acquired this image of the area in front of the lander using its lander-mounted Instrument Context Camera (ICC). The image was acquired on December 1, 2018 (Sol 5 of the InSight mission), when the local mean solar time for the image exposure was 13:04:20. Each ICC image has a field of view of 124 x 124 degrees. Image Credit: NASA/JPL-Caltech

NASA gives new insight on Mars

InSight probe reveals desolate landscape as dust settles after its arrival on the Martian surface.

On 26 November 2018, NASA’s InSight lander, designed to reveal the mysteries of Mars’s interior, descended from space to the surface of Mars. The $814 million lander, developed by the Jet Propulsion Laboratory in Pasadena, California, followed its script to the letter, culminating in the successful unfurling of its critical solar panels. It was NASA’s eighth successful landing in nine tries.

The landing was close to a bull’s-eye on its target, a vast lava plain, and the lander’s first images revealed a landscape seemingly ideal for placing the spacecraft’s two primary instruments, a seismometer and a heat probe. Over the next three months, InSight’s scientists will slowly place both instruments on the surface, with science operations expected to begin by March 2019.

InSight wasn’t the only robot entering Martian airspace for the first time. Two mini-spacecraft, each about the size of a briefcase, were tagging along as part of the first mission to send tiny spacecraft known as CubeSats into interplanetary space.


Collectively known as Mars Cube One, but separately referred to as MarCO-A and MarCO-B, their mission was to collect information from InSight as it descended to the surface, and then relay that information to mission control at JPL. They succeeded not only in doing that, but also in sending back a stark, evocative image of Mars as they departed.

“This is a fantastic day for spacecraft, great and small,” says JPL’s Andrew Klesh. “This team of really mostly part-timers has proven the technology that we were trying to demonstrate with this mission.”

With enough data from enough different directions, scientists should be able to put together a picture of the planet’s alien heart. A second instrument will be deployed to take the planet’s temperature, drilling deep into Mars to find out how much heat is still escaping from its core. Altogether, InSight’s readings will help scientists figure out how planets are put together and how they evolve, says Suzanne Smrekar, the mission’s deputy principal investigator. That’s important not only for better understanding our own solar system, but also for deciphering clues about much more distant planets circling other stars. “Really understanding the whole enchilada, not just the surface,” Smrekar says, “is essential to really being able to make a reasonable prediction about what’s going on in these distant worlds.”

© NASA



European Union begins Horizon Europe discussion with eight countries

Carlos Moedas, the European Commissioner for Research, is preparing to brief Japan, the US, Canada and others about possible collaboration with the EU’s €94.1 billion R&D programme, Horizon Europe.

Argentina, Australia, Brazil, Canada, Japan, New Zealand, South Africa and the US are scheduled to meet EU Research Commissioner Carlos Moedas on 3rd December in Brussels to discuss the Commission’s biggest-ever R&D initiative, worth €94.1 billion, which will succeed the current programme, Horizon 2020. The programme is expected to start in 2021 and last for seven years.

Currently, the vast bulk of the Horizon 2020 money goes to researchers in the EU and its neighbours; from 2016 through 2017, just 0.95% of grants went to researchers from so-called third countries. But Moedas wants to boost collaboration with other wealthy countries, as a way to strengthen European competitiveness in science and technology. So far, that has proven difficult, with international participation falling due to financial and legal issues. And there are signs that third-country participation in Horizon Europe could be even more constrained, with some European politicians pushing to make it a “Europe first” initiative.

The Commission claims its programme is the most open in the world to international collaboration, citing collaborative partnerships such as developing tropical medicines with African researchers, or agriculture and biotechnology research with China. However, collaboration with the EU is not always simple for scientists in rich countries. If they are unable to demonstrate that they have unique knowledge that their European partners need, they normally have to bring their own funding to join a Horizon project. This is not something they tend to be keen on, due to budget constraints in their home countries. Even those with the money face a number of legal and contractual complications, which have deterred some non-EU universities and research institutions.

However, rich countries have recently been able to formally “associate” with the programme, meaning their governments contribute some funding and their researchers can bid for Horizon money alongside Europeans. Sixteen countries, including Switzerland and Norway, already have this status. Exploring these terms has been hampered by Brexit: the UK has said it wants to associate with Horizon Europe after it leaves the EU, but EU Brexit negotiators banned any discussion of what those terms might be, for fear of complicating the main Brexit deal.


On 14 November, the UK-EU negotiators published their draft Brexit withdrawal agreement and political statement on the future relationship, freeing Commission R&D officials to start talking more openly to other governments. Then, on 20 November, Commission staff gave a briefing on Horizon Europe to embassy science counsellors in Brussels. The Moedas meeting on 3 December is the next step. There, the Commissioner is due to brief the ambassadors on international participation in Horizon Europe. The ambassadors are expected to counter with their own governments’ opinions about the programme, the headlines of which were included in their joint letter in March. In it, the ambassadors wrote that Horizon Europe – also known as Framework Programme 9 or FP9 – “should continue to focus on developing scientific excellence in Europe and, at the same time, enhance mutually beneficial partnerships with the rest of the world.” The letter urges the Commission “to consider the systems and framework conditions that support international collaboration with our respective countries on research and innovation, allowing European researchers to work with the most appropriate actors worldwide and with the aspiration of making FP9 truly open to the world. Recognising that our respective research and innovation systems differ, this would include continued simplification and greater flexibility of programme administration, along with active support for third country participation in FP9.” In the end, it’s an open question whether any of the eight countries will associate with Horizon Europe.



ERC awards €573 million to research projects

The European Research Council has announced the results of its Consolidator Grant competition, awarding a total of €573 million to 291 scientists across the EU.

The Consolidator Grant awards, allocated by the European Research Council and funded by the Horizon 2020 programme, give mid-career researchers the opportunity to build up their teams and develop their research. This year’s winners hail from universities and research centres across 21 EU Member States, representing 40 nationalities, and will receive on average around €2 million per grant; 32 per cent of the total funding was awarded to female candidates.

European Commissioner for Research, Science and Innovation Carlos Moedas said: “This EU grant provides a real boost to research and innovation in Europe because it gives top scientists the chance to take risks and pursue their best and maybe wildest ideas. I am pleased to see these European Research Council grants will support such a diverse group of people of 40 nationalities working in over twenty countries and that the list of grantees also reflects that we have many excellent women scientists in Europe.”


Projects awarded the European Research Council Consolidator Grant included: “How does life start?” – a study of DNA programming in cells; “Refugee protection in East-Central Europe in the 20th century” – international research of refugee migration and reception; “Improving quantum cryptography” – developing cryptographic security; “Natural dynamics in bipedal robot locomotion” – developing robots which can functionally walk; “Enhancing augmented reality using a transformable metasurface” – developing device architecture in augmented and virtual reality technologies; “Where are beetles going?” – a study of insects’ navigational capabilities; and “Wage inequality within and across firms” – examining economic gender inequality.

The European Research Council received 2,389 research proposals for the Consolidator Grant, of which around 12 per cent received funding. In total, the funding provided by these grants is expected to create an estimated 1,750 jobs for research staff.



Antibiotics could protect against Neurodegenerative Diseases

An antibiotic, minocycline, can increase the lifespan of roundworms by preventing the build-up of proteins during aging, a study in the open-access journal eLife reports.

Protein aggregation causes a number of age-related disorders including Alzheimer’s disease, Parkinson’s disease, amyotrophic lateral sclerosis (ALS), and prion diseases. Scientists at the Scripps Research Institute have now found that an FDA-approved antibiotic, minocycline, can prevent the build-up of proteins in aging roundworms and extend the animals’ lifespan.

Crawling C. elegans hermaphrodite worm. Credit: Bob Goldstein

Although it’s not known whether the antibiotic could have similar protective effects in humans, the researchers, headed by Michael Petrascheck, Ph.D., an associate professor at the Scripps Institute, suggest the findings in the roundworm model Caenorhabditis elegans could lead to the development of optimized minocycline-based treatments for neurodegenerative disorders in people.

“We have identified minocycline as a drug that can extend lifespan and improve protein balance in already-aging worms,” commented Dr. Petrascheck, who is senior author of the team’s paper in eLife.

“It would be great if there were a way to enhance proteostasis and extend lifespan and health, by treating older people at the first sign of neurodegenerative symptoms or disease markers such as protein build-up,” added lead author Gregory Solis. “Our study reveals how minocycline prevents protein aggregation and lays the foundations for drug-development efforts aimed at optimizing this already-approved drug for a range of neurodegenerative diseases.”

“While it is not known whether minocycline extends lifespan in mammals, its geroprotective effects reduce age-associated protein aggregation and inflammation as evidenced by numerous preclinical and clinical studies,” the authors commented. They also suggested that the drug’s proposed mechanism of action offers “a unifying explanation for the many seemingly unrelated effects of minocycline observed in preclinical and clinical studies, including its ability to reduce tumor growth, inflammation, and improve symptoms of fragile X … Translation attenuation by reducing ribosomal load as an MOA provides a simple and compelling explanation for these seemingly unrelated beneficial effects.”

As the authors summarized, “Repurposing FDA-approved drugs such as minocycline using phenotypic screens reveals promising effects outside the primary indication (antibiotic) of minocycline and inevitably leads to promising new drug target(s) and MOAs … our studies on minocycline shed light on the plasticity of longevity mechanisms upon aging and reveal an MOA for minocycline that explains its geroprotective effects.”

Study finds use of Social Media increases depression and loneliness

The link between the two has been talked about for years, but a causal connection had never been proven. For the first time, University of Pennsylvania research based on experimental data connects Facebook, Snapchat, and Instagram use to decreased well-being.

Ever since sites like Facebook and Instagram became part of daily life, scientists have wondered whether they contribute to mental health problems. In fact, research has hinted at a connection between social media use and depression for several years. A new study, published in the Journal of Social and Clinical Psychology, has added more evidence to the theory.

The researchers from the University of Pennsylvania designed their experiment to be more comprehensive than previous studies on the topic. Rather than relying on short-term lab data or self-reported questionnaires, they recruited 143 undergraduate students to share screenshots of their phone battery screens over a week, to collect data on how much they were using social media apps – Facebook, Snapchat, and Instagram. Subjects were told either to maintain their typical social media behaviour, or to limit it to 10 minutes per day. Alongside the screenshot data, the researchers also looked at how much the participants experienced fear of missing out, anxiety, depression, and loneliness.

“Here’s the bottom line,” said Melissa G. Hunt, a psychologist at the University of Pennsylvania and lead author of the study. “Using less social media than you normally would leads to significant decreases in both depression and loneliness. These effects are particularly pronounced for folks who were more depressed when they came into the study.” She added that 18-to-22-year-olds shouldn’t stop using social media altogether, but cutting down might be beneficial.

People tend to present a more glamorous, positive, and enviable lifestyle on their social media. In fact, over half of millennials admit they portray their relationship as better than it really is. This is a problem because your social media life can become a vicious circle – wanting others to be jealous of your life, while constantly comparing yourself to those on your feed. “If you spend most of your time scrolling through your newsfeed checking out other people’s lives and compare them to your own, you become more at risk of developing (or having worsening) symptoms of depression or anxiety,” psychologist Allison Abrams told Business Insider. “This is especially so in those with low self-esteem.”



A cure for AIDS could be closer than we think

Medical research enters a new era to find ways to eradicate HIV from infected populations.

More than 50 years after it jumped the species barrier and became one of the most devastating viruses to affect mankind, HIV remains a stubborn adversary. Treatment has improved dramatically over the past 20 years, but people who are infected will remain so for the rest of their lives, and must take one pill daily – at one time it was a cocktail of 30. But now, as another World Aids Day pulls into view, scientists are beginning to ask if the biggest breakthrough – an out-and-out cure for the tens of millions who have contracted the virus – could be in sight.

The excitement lies in research that is having some success in drawing the virus out of a latent stage (in which it can lie undetected for long periods) so that it could be destroyed. The difficulty in dealing once and for all with HIV is that, unlike other viruses, HIV-infected cells are able to “hide” by entering a resting phase that makes them invisible to our immune system and current treatment therapies. These ‘latently infected’ cells then exist in reservoirs in the bodies of those with the virus, and can launch new, resistant attacks if treatment is discontinued. Over the last few years, scientists have determined that destroying this last reservoir of cells will be the future of a cure.

“People have been looking into latency long before this became a focus for a cure,” said Dr Jonathan Angel, chief of the Division of Infectious Diseases at Ottawa Hospital Research Institute. “There have been people working on HIV research for a long time, but only recently has the tone shifted.”

“In my personal opinion, our therapies have become so good that looking at better therapies has become futile. People are taking a pill once a day with no side effects. That’s hard to improve upon.”

In fact, the increased efficacy of antiretrovirals has curbed the deadliness of the disease significantly. The virus exists in around 36.7 million people around the world and, although it killed around 1 million people in 2017 according to UNAids, that represents a drop of almost 50% from 2005, when mortality from Aids-related illnesses was at its peak.

Banning plastic ‘worse for the planet’ according to Scottish scientists

Banning plastics in an attempt to combat the amount of man-made rubbish piling up in the environment could cause the planet more harm than good, a leading group of Scottish academics have warned.

The declaration comes from a group of experts at Edinburgh’s Heriot-Watt University who have set up a new network to examine the issues surrounding plastics. They have hit out at calls to cut the use of plastic, or to ban it altogether, claiming the arguments are short-sighted and not based on facts. Though ending use of plastic may seem like a move towards a more earth-friendly future, they say the opposite may be true.

The group, from engineering, economics and science faculties, acknowledge that the increasing amount of plastic litter accumulating in landscapes and seas must be addressed. However, they are warning against a knee-jerk reaction which could lead to a doubling of global energy consumption and a trebling of greenhouse gas emissions.

Professor David Bucknall, chair in materials chemistry at Heriot-Watt’s Institute of Chemical Sciences, insists that there are no obvious alternatives. “Almost everything we touch or interact with on a daily basis is made of or contains a plastic of some description,” he said. “Banning or reducing their use would have a massive impact on the way we live. For instance, replacing plastics with alternative materials such as glass and metals would cost more to manufacture due to the energy consumed and resources, including water, required to process them.”


“Efforts should be directed towards creating a circular economy for plastics that integrates product design, use, recycling and reuse of plastics to reduce indiscriminate disposal. There are important gaps in our understanding, but we should not be rushing to conclusions in order to provide makeshift answers.” Ocean campaigners have countered the claims. “We can’t recycle our way out of this one,” Dr Sue Kinsey, waste specialist for the Marine Conservation Society, said. “If plastic is properly used and disposed of, that’s all well and good. But when it falls out of the system, it becomes litter, goes into the sea. “Plastic in the environment is much more harmful than glass or aluminium, especially to wildlife because they eat it, they get entangled in it and it has potential to attract toxins, which get absorbed into human and animal tissues. “It’s true we do have to look very carefully at substitutions where substitutions are needed to make sure we don’t have a reverse effect. But we’re not actually talking about a one-on-one substitution. “We’re talking about reducing use first, then substitution with more sustainable materials – for example, glass can be recycled many, many more times than plastic. Also plastic itself comes from the oil industry, which is a finite resource.”



A new perspective on cell fusion

Fission yeast gametes during sexual reproduction. Purple marks sites of polarization for location of the actin focus. Green is expressed in one of the two gamete types.

Cell fusion is critical to sexual reproduction and the ongoing development of an organism, yet many questions remain around the molecular basis of the process. Researchers at the University of Lausanne are combining several different techniques to gain deeper insights in this area, as Professor Sophie Martin explains.

The process of cell fusion is integral to sexual reproduction and the ongoing development of an organism, yet there are still significant gaps in our understanding of it. One of the biggest open questions is that of how two previously separate cells, each surrounded by a plasma membrane, come together to form a single unit. “This is still largely a mystery in many organisms,” says Sophie Martin, a Professor in the Department of Microbiology at the University of Lausanne. This question is at the core of Professor Martin’s work as the Principal Investigator of the CellFusion project. “There are also other questions about communication between the cells, and about what happens just after fusion,” she continues. “During sexual reproduction two gametes – in mammals it would be an oocyte and sperm – fuse together. Once they’ve fused, the development of the new organism starts, from the fertilised oocyte, also called a zygote. It’s important that this fusion does not happen repeatedly. What changes in the zygote to inform it that it has fused successfully, and that it may look towards development?”

Cell fusion

A model organism called Schizosaccharomyces pombe, a species of yeast, is being used to investigate this process in great detail. This organism has attributes that make it ideal for observing cellular processes. “It’s a single-celled organism, so it’s easy to image, because you have immediate access to the cells, which are not buried deep in a tissue,” she outlines. This model system has proved to be very useful in the past, for instance to unravel the controls of cell division. Professor Martin and her colleagues are using it now to gain insights into the cell fusion process. “It’s relatively easy to modify the genome or to mark endogenous proteins with a fluorescent tag, so that we can image the fusion process with minimal disruption,” she says. “This allows us to visualize the fusion process, take movies, and investigate how cells are progressively reorganized over time.”

This work builds on earlier research in cell biology, in particular the observation that some level of cellular re-organisation takes place at the time that cells fuse together. Through the use of live-cell microscopy, Professor Martin and her research group previously observed that the actin cytoskeleton becomes very concentrated at the point of cell-to-cell contact during the preparation for cell fusion. “This reorganization of the actin permits the transport of cargoes to a very restricted location at the zone of cell contact. That, in turn, allows the secretion of enzymes that digest the cell wall, so that the plasma membranes can come into contact and eventually fuse,” she outlines. Typically there is one point of contact between fusing cells, which allows the exchange of materials. “There are some examples where cells may fuse in several locations, or make several little holes. But typically there’s only one point of contact,” says Professor Martin.

A number of different approaches are being applied in the project, including optogenetics and biochemical methods, with the wider aim of gaining new insights into the cell fusion process. In the case of yeast cells, cell fusion takes place during sexual reproduction, which can be triggered by starving the cells. “We can induce the process by simply starving the cells – typically we withdraw nitrogen,” explains Professor Martin. When the two mating types of yeast are present in a cell population in these circumstances, Professor Martin says they will then start to communicate. “That communication is based on chemical perception; the cells secrete pheromones, which will diffuse in the medium. The cells can sense these pheromones through receptors which they express on their cell surface and differentiate into gametes,” she continues. “The pheromones are short peptides which are secreted outside the cell. They’re detected by the partner cell, by receptors that are part of a very large family, called G-protein coupled receptors.”

Once the cells have detected these pheromones, they then face the challenge of precisely locating the pheromone source before fusion can take place, because the growth machinery needs to be accurately positioned for the cell to grow in the direction of a partner cell. “This is highly dependent on a protein called Cdc42, which is part of a family of enzymes called small GTPases,” says Professor Martin. Cdc42 has been found to be an important factor in cell polarisation in pretty much all of the organisms in which it has been investigated, and Professor Martin is now looking at its role in Schizosaccharomyces pombe. “The aim is to try and understand how Cdc42 becomes active in the right direction, on the right position at the periphery of the cell – the one that faces the partner cell,” she outlines. Once Cdc42 is active at the right place, the cell grows towards its partner and both cells eventually come into contact. “Only then should the cells focus their actin cytoskeleton and digest their protective cell walls to start the fusion process,” says Professor Martin.

The question then arises of how the cells coordinate the progression of the fusion process, a topic that fascinates Professor Martin. “How do cells know when to digest their cell wall? If this happened before the gametes are in contact, they would burst,” she explains. Significant progress has been achieved in this respect over the course of the project. “We’ve found that the pheromone signal is emitted and perceived at the site of polarisation. As cells grow closer together, the proximity of the signal and thus its strength increase, and this serves to stabilize the actin focus. So only when cells are close will they focus their actin cytoskeleton and digest their cell wall,” she explains. “This illustrates interesting links between cell signaling and the cytoskeleton and shows the importance of the spatial dimension of signaling.”

Cell fate change after fusion

Professor Martin and her colleagues have also gained important results on what happens after cell fusion. Once two gametes have fused, the newly-formed zygote should now differentiate and avoid fusing with another gamete. “We’ve found that this is mediated by a change in the transcriptional capacity of the cell,” she continues. “Before they fuse, the two gametes each carry half of a factor that then re-combines after fusion and forms one entity that very rapidly initiates gene expression.” The initiation of zygotic gene expression essentially informs the zygote that the fusion process has been completed. This prevents a second fusion event, which would result in excessive genomic content, and moves the zygote to the next stage in development. While the project’s research has been conducted using Schizosaccharomyces pombe as a model system, Professor Martin hopes that their results will hold broader relevance beyond this specific fungal species. “We work on proteins and cellular processes that are very conserved through evolution, so they’re not present only in yeast. We hope that a number of our results will be valid not only in the yeast that we’re studying, but more broadly,” she says.

CELLFUSION Cell-Cell fusion in fertilization and developmental biology: a cellular and molecular dissection Project Objectives

The objectives of CellFusion are to depict at the molecular level the process through which two cells fuse together during sexual reproduction. We aim to understand how the cells communicate, assemble a fusion structure, fuse and terminate the process.

Project Funding

ERC Consolidator Grant - Cellular and Developmental Biology.

Contact Details

Professor Sophie Martin, Ph.D
Department of Fundamental Microbiology
University of Lausanne
Biophore Building
CH-1015 Lausanne, Switzerland
T: +41 21 692 3931
E: Sophie.Martin@unil.ch
W: http://wp.unil.ch/martinlab

Merlini L, Khalili B, Bendezú FO, Hurwitz D, Vincenzetti V, Vavylonis D, Martin SG. Local Pheromone Release from Dynamic Polarity Sites Underlies Cell-Cell Pairing during Yeast Mating. Curr Biol. 2016 Apr 25;26(8):1117-25.

Dudin O, Merlini L, Martin SG. Spatial focalization of pheromone/MAPK signaling triggers commitment to cell-cell fusion. Genes Dev. 2016 Oct 1;30(19):2226-2239.

Merlini L, Khalili B, Dudin O, Michon L, Vincenzetti V, Martin SG. Inhibition of Ras activity coordinates cell fusion with cell-cell contact during yeast mating. J Cell Biol. 2018 Apr 2;217(4):1467-1483.

Vještica A, Merlini L, Nkosi PJ, Martin SG. Gamete fusion triggers bipartite transcription factor assembly to block re-fertilization. Nature. 2018 Aug;560(7718):397-400.

Professor Sophie Martin, Ph.D

Photo by Félix Imhof © UNIL


Professor Sophie Martin, Ph.D is a Full Professor in the Department of Microbiology at the University of Lausanne. She previously held research positions in Europe and America, including post-doctoral training at Columbia University in New York, where she studied cell polarization and the cytoskeleton in the fission yeast.

Fission yeast gametes (green and purple) normally fuse only once during fertilization. In abnormal mutant situations, a second cell fusion event can be observed.



The maths behind disease transmission

Mathematics is an important tool in describing the spread of infectious diseases, and researchers in the TransMID project are developing sophisticated new methods to model transmission even more accurately. We spoke to Professor Niel Hens about the project’s work and its implications for public health.

The use of mathematics to model infectious diseases dates back to the work of Daniel Bernoulli in the late 18th century, when he developed a model to assess the impact of eradicating smallpox. Mathematics is now a central tool in this area, with ever more sophisticated methods emerging to model the spread of infectious diseases, a topic at the heart of the TransMID project. “We use mathematics to describe how infectious diseases are spread. We try to inform these models, as much as possible, with data,” says Professor Niel Hens, the project’s Principal Investigator. This means serological and social contact data, both important sources of information in terms of understanding the spread of airborne infections. “There are two parts to the project. The first involves looking at social contact data, and assessing how predictive they are for the transmission of airborne infections,” continues Professor Hens.

Social contact data

A large amount of data is available for Professor Hens and his colleagues to probe deeper into this area and gain more specific insights. The first large-scale social contact survey was conducted as part of the earlier Polymod project, which has since been supplemented by further data. “We ask people about who they met during a single day, and then they have to record different levels of non-sexual intimacy. For example, do you meet that person every day? How long did you meet that person for? At what location did you meet that person?” outlines Professor Hens. By analysing this data and drawing comparisons with infectious disease data on the ground, Professor Hens and his colleagues hope to learn more about which types of contacts are responsible for the transmission of airborne infections. “We can look at different levels of intimacy, and try to disentangle which ones explain what we see in practice,” he explains. Researchers have conducted a study on the varicella-zoster virus (VZV) for example, which causes chicken pox in young people. The study showed that physical contacts lasting at least 15 minutes were the most predictive


in terms of explaining the transmission of the virus, while Professor Hens has also looked at other viruses. “We also looked at parvovirus B19, and there transmission was associated with close contact lasting at least an hour or more. So different levels of intimacy are needed to describe different infections,” he says. The demographic profile of the population is another important factor behind the transmission of disease. “We know for instance that children and teenagers are much more socially connected than adults and the elderly. Children also have much more mucus secretion, and therefore they can transmit a virus to others more easily,” points out Professor Hens.
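To give a concrete, if simplified, flavour of how survey data of this kind feed into transmission models, the sketch below sets up a minimal age-structured SIR model in Python, with the force of infection weighted by a Polymod-style contact matrix. The two age groups, contact rates and epidemiological parameters are invented for illustration only; this is not the TransMID team’s actual model or data.

```python
import numpy as np

# Minimal age-structured SIR sketch (illustrative values only, not TransMID's model).
# Two age groups: children and adults. contacts[i, j] = average daily number of
# contacts a person in group i has with people in group j (Polymod-style data).
contacts = np.array([[12.0, 4.0],
                     [4.0, 8.0]])        # assumed contact rates
population = np.array([2e6, 8e6])        # assumed group sizes
q = 0.025                                # assumed per-contact transmission probability
gamma = 1.0 / 7.0                        # recovery rate (1 / infectious period, days)

S = population.astype(float).copy()
I = np.array([10.0, 10.0])               # initial infectious individuals
R = np.zeros(2)

dt = 0.1
for step in range(int(200 / dt)):        # simulate 200 days with a simple Euler scheme
    # Force of infection on group i: contact rate with each group j, weighted by the
    # probability that a randomly met member of group j is currently infectious.
    lam = q * contacts.dot(I / population)
    new_inf = lam * S * dt
    new_rec = gamma * I * dt
    S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec

print("Final fraction ever infected per age group:", R / population)
```

The design point is that who-mixes-with-whom enters the model only through the contact matrix, which is exactly the quantity the social contact surveys described above set out to measure.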

Serological data

The second part of the project’s work involves analysing serological data – that is, data from blood samples – with the goal of developing software that can be applied more widely in disease management. With the serological data, Professor Hens aims to dig deeper into the underlying factors behind the spread of disease. “Which infectious disease parameters can we estimate from this data? What can we learn about susceptibility in a population?” he outlines. This research holds important implications for the management of disease, with researchers in the project looking at pertussis (also known as whooping cough), cytomegalovirus and measles. “Let’s take the example of measles. The World Health Organisation (WHO) has set the goal of eradicating it by 2020,” says Professor Hens. “This is a very challenging target, because people are increasingly reluctant to be vaccinated against it, especially in developed countries.”

This serological data, analysed together with the contact data in relation to transmission, could potentially enable scientists to identify whether disease outbreaks are likely to occur. These types of forecasts are not always completely accurate, yet Professor Hens believes they can be a valuable tool in managing disease and identifying those sections of the population that may be at risk. “What are the age-groups that should be targeted in a measles vaccination campaign? That’s one of the potential applications arising out of our research,” he explains.

Image plot of social contact rates by age reproduced from Goeyvaerts et al. (JRSS-C, 2010).

Different diseases of course have different characteristics; cytomegalovirus is quite a complex case, in the sense that it is thought to influence the immune system, yet the underlying mechanisms are not fully understood. “We are trying to describe the dynamics of CMV infections. CMV is known to re-activate in the body when you’re exposed again to a certain virus. You can also be re-infected,” says Professor Hens.

The serological data on pertussis is more difficult to interpret, so the project’s work in this area is more exploratory in nature, with researchers looking to probe deeper into the infection. With most infections, when the amount of antibodies in a blood sample goes above a certain threshold, that is taken as evidence that the person has previously been infected. “This is what people call a correlate for protection. So if you have a high presence of antibodies in your blood, then it’s an indication of past infection,” says Professor Hens. This is not the case with pertussis, however. “With pertussis, serology does not offer this correlate of protection. We do not really know what the threshold is – there are some indications, but it’s not clearly known,” continues Professor Hens. “We’re trying to see what information we can gain from the serological data on pertussis.”

Further data would be very valuable in these terms, and Professor Hens believes that a more comprehensive approach to collecting serological data would lead to wider health benefits. The Netherlands, for example, already has a programme of collecting blood samples. “They actually test samples from the population around every seven years. So they have 12,000 samples which they test for specific pathogens,” says Professor Hens. Analysis of this data can lead to valuable insights into the general health profile of the population. “You can learn about how susceptible the population is, and about the dynamics of certain infectious diseases,” he explains. “I believe we should really be collecting and sharing this data on a very regular basis, in as many countries as possible.”
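As a similarly hedged illustration of what estimating an epidemiological parameter from serological data can look like in practice, the sketch below fits the simplest catalytic model – in which the probability of being seropositive by age a is 1 − exp(−λa), for a constant force of infection λ and lifelong antibody persistence – to invented age-seroprevalence counts by maximum likelihood. Both the numbers and the constant-λ assumption are illustrative only; the project’s actual methods are considerably more sophisticated.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative (made-up) serosurvey: age in years, number tested, number seropositive.
age = np.array([1, 3, 5, 10, 15, 20, 30, 40])
n_tested = np.array([120, 110, 100, 140, 130, 120, 150, 140])
n_pos = np.array([10, 25, 35, 80, 95, 100, 135, 132])

def neg_log_lik(lam):
    # Catalytic model: P(seropositive by age a) = 1 - exp(-lam * a), assuming a
    # constant force of infection and antibodies that persist for life.
    p = np.clip(1.0 - np.exp(-lam * age), 1e-9, 1 - 1e-9)
    # Binomial log-likelihood of the observed counts.
    return -np.sum(n_pos * np.log(p) + (n_tested - n_pos) * np.log(1.0 - p))

fit = minimize_scalar(neg_log_lik, bounds=(1e-4, 2.0), method="bounded")
print(f"Estimated force of infection: {fit.x:.3f} per year")
```

In this toy setting a single fitted number summarises how quickly individuals acquire infection as they age; richer models relax the constant-λ assumption and link the force of infection directly to the social contact data described earlier.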

Data sharing

A data-sharing initiative has already been established with this goal in mind, with data available via http://www.socialcontactdata.org/. There is a lot of interest among the research community in analysing this data and sharing more findings, which Professor Hens believes could lead to more detailed insights in future. “If we can share information more widely then we can ask more relevant questions. Instead of looking at just one specific setting or one specific disease, we can take a more general look at the situation. Alongside making tools and software available, we also want to push forward an open science attitude towards research,” he outlines.

Artwork by Frederik De Wilde based on social contact data. Studio@frederik-de-wilde.com


TRANS-MID Translational and Transdisciplinary research in Modeling Infectious Diseases Project Objectives

TransMID aims to develop novel methods, tools and software to estimate key epidemiological parameters from serological and social contact data, using new statistical and mathematical theory. TransMID is transdisciplinary in nature, with applications on diseases of major public health interest (e.g. pertussis, cytomegalovirus, measles), and translational, as the unifying methodology will be extendable to other diseases and settings, which should maximize TransMID’s impact on public health in Europe and beyond.

Project Funding

Hasselt University: €1,202,484.90 (including funding for 1 postdoc researcher and 2 PhD students)
Antwerp University: €435,682.98 (including funding for 1 0.8 FTE postdoc researcher)
This project has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement 682540 — TransMID)

Contact Details

Project Coordinator, Professor Niel Hens
Universiteit Hasselt - Campus Diepenbeek
Agoralaan Gebouw D - B-3590 Diepenbeek
Kantoor E109
T: +32 496 749191
E: niel.hens@uhasselt.be
W: www.uhasselt.be

Professor Niel Hens

Niel Hens is professor at UHasselt and UAntwerp, where he holds the chair in evidence-based vaccinology. He is a biostatistician and mathematical epidemiologist with 12 years of experience in human epidemiology. He is currently co-president of the Young Academy of Belgium.



Allergic diseases go under the research microscope

Allergic diseases are very common among children in the western world, and they have a serious impact on quality of life for both patients and their families. Professor Bianca Schaub tells us about her research on the immunological mechanisms behind allergic diseases, and its importance for the development of preventive strategies.

Around one in four children in the western world suffers from allergic diseases and the prevalence continues to rise, representing a significant public health problem. While a number of factors are thought to be involved in the development of allergic diseases, over recent decades a lot of attention has centered on the pre-natal environment, as well as the environment in the early months and years of life. “It has become apparent that these factors are probably more important than was previously thought,” says Professor Bianca Schaub. As head of the research group on allergy and immunology at Ludwig-Maximilians-Universität München (LMU Munich), Professor Schaub aims to probe deeper into the development of allergic diseases, focusing on four. “The four major diseases which we are looking at are food allergies, atopic dermatitis, asthma, and hay fever,” she outlines.

Environment and immunological tolerance development with a focus on the development of allergic diseases

The CHAMP Consortium aims to investigate the determinants of different allergic diseases (food allergy, atopic dermatitis, asthma, hay fever) across the whole pediatric age range. Particular attention will be given to factors determining primary tolerance (no onset of disease) and acquired tolerance (remission of existing disease). We will identify clinically relevant biomarkers predicting early onset, progression and remission.

Professor Dr. Bianca Schaub
Dr von Hauner Children’s Hospital Munich, LMU Munich
Lindwurmstr. 4, 80337 Munich, Germany
E: bianca.schaub@med.uni-muenchen.de

Bianca Schaub is head of the Allergy-Immunology research group and Deputy Head of the Asthma and Allergy Department and Attending at the Pulmonary Department of Munich University Children´s Hospital, Munich, Germany. After studying medicine in Munich, Dundee, and Melbourne, she completed her internship and residency training in Pediatrics. After her postdoctoral fellowship at Harvard University, Boston, USA, she returned to LMU Munich, Germany.

14

CHAMP consortium

There are thought to be several sub-phenotypes, or endotypes, among these diseases, and some children may suffer from more than one condition, so this is a complex area of research. Professor Schaub is Principal Investigator of the CHAMP (CHildhood Allergy and tolerance: bioMarkers and Predictors) consortium, in which large volumes of data have been gathered. “Several national and international birth cohorts are included in CHAMP, with a long-term follow up,” she explains. This provides a solid basis for researchers to investigate the determinants of different allergic diseases. CHAMP focuses on factors determining primary tolerance (no onset of disease) and acquired tolerance (remission of existing disease). The wider goal in research is to identify biomarkers which predict the onset, progression and remission of allergic disease, through which Professor Schaub hopes to lay the foundations for more targeted treatment in future. “My major focus is on the early years of life, and even pre-natally, to better understand the factors which contribute to the early development of allergic diseases. This is with a view to developing preventive strategies and preparing for future clinical trials,” she continues.

Uniquely, this consortium has the opportunity to investigate the development and remission of childhood allergies at all stages of childhood immune and organ development. Another major topic of interest is whether immune development, and the different risk or protective factors, are distinct for each allergic disease. “For example, it may be different for food allergies, atopic dermatitis, and hay fever,” explains Professor Schaub. “We are looking at data from children with different allergic disease across distinct age-ranges, from birth, throughout preschool, school-age, and adolescence.”

Gaining novel insights into the early onset of different allergic diseases, as well as natural tolerance and remission, from birth through to adolescence, will lay the ground for future development of preventive strategies. In the long term, this will open up new avenues for the development of novel therapeutic options, which will have a beneficial impact on the lives of patients and their families.

In addition to CHAMP, several other projects are ongoing, which assess mechanisms of protection against allergies. “We look at different protective environmental exposures. We have a few very strong cohorts on this, where we see protective environmental exposures,” says Professor Schaub. “For example, microbial exposures in different rural environments.” Researchers are investigating whether an immune system which has been exposed to different environmental exposures in the early years of life - even prenatally - then proves to be more effective in maintaining a healthy balance. The immune system may get primed to develop counter-regulatory mechanisms against known risk factors. “The idea of early microbial stimulation is a diverse general stimulation of different levels of the immune system,” says Professor Schaub.

A deeper understanding of allergic diseases would allow researchers to look towards developing improved preventive strategies. One priority is to enable much more individualised treatment. “This is already practiced in other fields of medicine, for example in cancer, where patients get quite targeted therapies,” points out Professor Schaub. The goal of enabling treatment tailored to the needs of the patient is central to Professor Schaub’s future research agenda. “This is a huge field, which is still not specific enough. This is a very important area of research,” she stresses.

EU Research


A new path to biomaterial development

Very small molecules can be used to produce functional materials with novel properties, yet it remains difficult to predict how these molecules interact with each other. We spoke to Dr Pim Frederix about his work in developing a computational framework that could help introduce a new level of predictability to biomaterial development.

Three snapshots from a simulation of peptides self-assembling from randomly dissolved molecules into hollow vesicles, after which the vesicles fuse into a hollow tube. The peptide backbone forms the outer surface (shown in red) and peptide side chains the interior of the tube walls (in white).

The ability to make biomaterials out of very small molecules is still relatively new, building on the discovery in the ’90s that these molecules, specifically peptides, can be used to produce the same functional materials as larger proteins. This spurred further research into using different types of molecules, derivatives, and shorter and shorter peptides to develop biomaterials for certain functions. “This is about really going down to the basic building blocks of the biological world, to see what you can still achieve in terms of materials,” says Dr Pim Frederix. Many promising molecules have since been found, yet research still often operates on a trial-and-error basis, an issue that Dr Frederix is addressing in a project funded by the Netherlands Organisation for Scientific Research (NWO). “People would think about what new materials would work for a certain application, but they would never really know whether they had made a good choice until they actually tried to synthesise such a material in the lab,” he explains. “The idea of the project was to simplify this discovery process and make it cheaper by using computer simulations before testing it in the lab.”

People would think about what materials would work for a certain application, but they would never really know whether they had made a good choice until they tried to synthesise such a material. The idea of the project was to simplify this process and make it cheaper.

Biomaterial development

This means introducing more predictability and structure into biomaterial development, work which could be of great interest to the commercial sector. While chemists nowadays can synthesise molecules with a large degree of control over their properties, it remains very difficult to predict how they will interact with other molecules. “This is where simulations have really proved useful, in looking at how molecules could come together to form a material,” says Dr Frederix. Researchers are simulating molecular dynamics, aiming to understand the molecules’ central characteristics. “We’re using a coarse-grain method, where we group certain amounts of atoms into single particles, and then study those particles. This can be done at various levels. We put four heavy atoms into a particle, based on functional groups in chemistry. A peptide amide group would be a certain type of particle for example. In this way we can distinguish between amino acids,” outlines Dr Frederix. “At the same time we are also speeding up simulations by a factor of roughly 1,000, so we can see more things happening and identify whether those interactions are important for a system.”

The ultimate goal in this research is to assess how small biomolecules interact with raw materials, including not just peptides but also lipids, proteins and other molecules. From there, researchers can then look to identify those materials which may be well-suited to specific applications, for example in the food or pharmaceutical industries, and the research team has already identified several promising peptides. “We are in touch with a small company that uses our results to make biomaterials for cell culture,” says Dr Frederix. There are also other potential applications, including in sun protection. “A few peptides have been discovered to polymerise into compounds resembling melanin, which provides protection against UV radiation. A research group is currently looking into putting that product onto the market,” continues Dr Frederix. “We also ran a project with a company that makes icing for cakes, so there’s a food component as well. We hope to continue finding new applications.”
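To make the coarse-graining idea concrete, the Python sketch below groups sets of roughly four heavy atoms into single beads by taking the mass-weighted centre of each group. The coordinates, masses and groupings are invented for illustration; the project’s actual simulations use an established coarse-grained force field rather than this toy mapping.

import numpy as np

# Toy coarse-graining: replace groups of ~4 heavy atoms with one bead each,
# positioned at the mass-weighted centre of the group (hydrogens are ignored).
atom_positions = np.array([  # Cartesian coordinates (nm) of 8 heavy atoms
    [0.00, 0.00, 0.00], [0.15, 0.00, 0.00], [0.15, 0.15, 0.00], [0.00, 0.15, 0.00],
    [0.40, 0.00, 0.00], [0.55, 0.00, 0.00], [0.55, 0.15, 0.00], [0.40, 0.15, 0.00],
])
atom_masses = np.array([12.0, 12.0, 16.0, 14.0, 12.0, 12.0, 16.0, 14.0])  # e.g. C, C, O, N

bead_definitions = [[0, 1, 2, 3], [4, 5, 6, 7]]  # four heavy atoms per bead

def coarse_grain(positions, masses, groups):
    """Return one mass-weighted centre per group of atoms."""
    beads = []
    for group in groups:
        pos, m = positions[group], masses[group]
        beads.append((pos * m[:, None]).sum(axis=0) / m.sum())
    return np.array(beads)

bead_positions = coarse_grain(atom_positions, atom_masses, bead_definitions)
print(bead_positions)  # two bead coordinates now stand in for eight atoms

Because far fewer particles, interacting through smoother potentials, have to be followed, longer time steps become possible and the simulations run much faster, which is where speed-ups of the order Dr Frederix mentions come from.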

NEW PEPTIDES FOR FOOD
New peptides for food, pharmaceuticals and functional materials

Dr Pim W.J.M. Frederix
Post-doctoral research associate
Groningen Biomolecular Sciences and Biotechnology Institute
Stratingh Institute for Chemistry
Rijksuniversiteit Groningen
T: +31 (0)50 363 4329
E: p.w.j.m.frederix@rug.nl

Dr Pim W. J. M. Frederix received his MSc Chemistry from Radboud University Nijmegen, The Netherlands. He moved to the University of Strathclyde (Glasgow, UK), where he completed his PhD in Physics on the topic of ultrafast spectroscopy and bionanotechnology. After postdocs at the University of Strathclyde and the University of Groningen, he received a VENI grant from the Dutch Organisation for Scientific Research to pursue his interest in how small molecules self-assemble into nanostructures in the group of Siewert-Jan Marrink, using a combination of spectroscopy, microscopy and computational modelling.

15


An active vision of visual perception

The visual system seems to give us a continuous view of the world and the objects around us, yet the eyes actually shift position three times a second through movements called saccades. This raises some interesting questions around how we achieve perceptual continuity, a topic of great interest to Professor Martin Rolfs.

Our eyes seem to give us a continuous, unbroken view of the world, yet in reality they shift position from one location to the next three times per second. These shifts are called saccades, and within a fraction of a second the eyes move to gather information at a new location. “This is quite different to what most people report about how they perceive the world,” says Martin Rolfs, a Professor of Experimental Psychology at Humboldt-Universität zu Berlin. This raises some interesting questions, particularly around perceptual continuity. “How do we perceive a stable, continuous world when the eye is jerking around and creating an image on the back of the eye, on the retina, that is displaced every 300 milliseconds?” asks Professor Rolfs.

Active perception and cognition

This question is a central part of Professor Rolfs’ agenda, with researchers at his laboratory investigating the underlying processes behind active perception and cognition. Traditionally, perception was looked at in a passive way; for example, people participating in a study were asked to fixate on a point on a screen for a period of time. “They were presented with stimuli. People reported on these stimuli and reached a judgment about them,” outlines Professor Rolfs. But in daily life, we don’t typically fixate on certain objects for long periods of time, so now researchers embrace the fact that the visual system actively collects information about the world. “We move our eyes all the time, and these eye and body movements affect what we perceive,” explains Professor Rolfs.

The team at Professor Rolfs’ laboratory is now investigating this area in greater depth, using a variety of techniques to revise our picture of perception and cognition. This includes building on ideas from computer vision, where scientists have been working to develop cameras capable of recognising certain objects in images, which is crucial to some emerging technologies. “Self-driving cars need to recognise and track the things around them constantly,” points out Professor Rolfs. The brain does this on a continuous basis, so Professor Rolfs believes human vision and computer vision can learn from each other. “When they move a camera, for instance, they learn more about the structure of a scene than on a static image,” he explains. This is because the movement itself creates relative motion of objects in a scene, so things that are further away appear to move more slowly across the image, while those that are closer move more rapidly. The idea of active vision became more prominent in the computer vision community as a result, with researchers starting to move the camera more to understand a scene better. A similar shift has happened in neuroscience and psychology. “We are trying to understand the brain and how it works with visual information. People have started looking at the interaction between eye movements and perception as well as cognitive processes like memory,” says Professor Rolfs. “We can probe what someone sees and what they ignore in an experimental setting.”

How do we perceive a stable, continuous world when the eye is jerking around and creating an image on the back of the eye, on the retina, that is displaced every 300 milliseconds?

A lot of information is typically available in a visual scene, whether it’s shop names, car registration numbers or petrol prices, and we all have to work out what we want to focus on at any given point in time. This is based on very crude information picked up in the periphery of the visual field, on the basis of which a decision is reached on where to move the eyes. “You then move them to a new location – but before you move, you shift attention there, and start analysing what’s going on in much more detail,” says Professor Rolfs. Before the eye moves to a new location, the brain starts processing that part of the scene much more rigorously than any other. “It starts ignoring other things and the part that you’re moving the eye towards becomes central to your awareness,” says Professor Rolfs. This happens every time that the eyes move to a new location. So there is an ongoing interaction between this covert selection of certain parts of a visual scene, and the subsequent movement of the eyes, which allows us to gain more detail about a certain part of a scene. “When you place your fovea, the centre of your gaze, on a certain point in a scene, you have a much higher resolution on that point. You can see much finer details,” explains Professor Rolfs. An attention shift precedes this eye movement. “Before the eyes move, the brain provides the mind with a preview of the target of your eye movement. Then the eyes move and you get a really crisp image on the fovea, the central, high-resolution part of the retina,” continues Professor Rolfs.

A participant in an experiment is looking at – and reporting on – stimuli on a screen, while her eye movements are tracked (Credit: Julius Krumbiegel).

Understanding psychological disorders

EXPERIMENTAL PSYCHOLOGY
Experimental Psychology: Active Perception and Cognition

Project Objectives

When observers actively explore and manipulate their environment, the fundamental processes of perception, cognition, and movement control become intricately related. To understand perception and cognition in active observers, we leverage a broad range of methods including eye and motion tracking, psychophysics, computational modeling, EEG, robotics, and studies of clinical populations.

Project Funding

Martin Rolfs’ research is financially supported by the German Research Foundation, the Deutsche Forschungsgemeinschaft (DFG; grants RO3579/8-1 & RO3579/9-1). The collaboration with Katherine Thakkar at Michigan State University is supported by the National Institute of Health (NIH; grant 1 R01 MH112644-01A1).

Contact Details

Project Coordinator, Martin Rolfs, Prof. Dr. phil. Heisenberg Professor für ‘Allgemeine Psychologie: Aktive Wahrnehmung und Kognition’ Institut für Psychologie | HumboldtUniversität zu Berlin Unter den Linden 6 | 10099 Berlin, Germany T: +49 (0)30 2093-6775 E: martin.rolfs@hu-berlin.de W: http://rolfslab.de W: www.martinrolfs.de

Recommended Reading Rolfs, M. (2015). Attention in active vision: A perspective on perceptual continuity across saccades. Perception, 44, 900-919. Thakkar, K.N., Diwadkar, V.A., & Rolfs, M. (2017). Oculomotor prediction: a window into the psychotic mind. Trends in Cognitive Sciences, 21, 344-356.

Professor Martin Rolfs

Martin Rolfs is Heisenberg Professor for Experimental Psychology at HumboldtUniversität zu Berlin. His research focuses on dynamic processes of visual perception and cognition in active observers, looking at how the movements of our eyes, heads and bodies confine what we perceive and how we perceive it.

Electrodes are attached to the EEG cap that will then be placed on the head of a participant in an experiment (Credit: Julius Krumbiegel).

Photo by Kopf & Kragen

The goal in research now is to build towards a theory of active vision, of how continuous perception is achieved, despite the presence of all these movements and changes on the retina. This could also lead to new insights into how attention works in disorders like schizophrenia, autism or attention deficit hyperactivity disorder (ADHD), a condition characterised by difficulties in paying attention to a specific task, although research suggests this is accompanied by heightened perception elsewhere. “One of my students has shown that kids with ADHD are better at detecting unexpected changes somewhere in the visual field than a normal person. They had a broader focus of attention and were easily distracted,” explains Professor Rolfs. “If they are given a task that requires more distributed attention, they might actually perform better than other people.” This is illustrated by the example of a well-known video in which viewers are asked to count the number of times players in a basketball game switch the ball from hand to hand. Most people become so absorbed in the task that they fail to notice a gorilla walking across the screen. “This might seem surprising to many people, but it shows how much we ignore in everyday vision. We just focus on whatever we need to focus on at that particular moment,” says Professor Rolfs. By contrast, Professor Rolfs says that children with ADHD often noticed the gorilla straight away. “This suggests that their attention system is much more excitable by external stimuli while they are engaged in a certain task, but more research is needed in this domain,” he explains.

Researchers have also looked at patients with other disorders, including schizophrenic patients. One theory about the cause of schizophrenia symptoms, such as hearing voices and experiencing hallucinations, relates to visual perception and cognition. “A lot of the processes that I’ve described are predictive. So you shift attention to the target of your upcoming eye movement – you’re predicting where the eye will be, and effectively anticipating the movement and gathering information at that location,” outlines Professor Rolfs. “One theory about schizophrenia is that patients’ brains might be making these predictions, but the other part of their brain – that would normally deal with them – is not receiving them in some way.” An experiment has been developed in collaboration with Professor Katy Thakkar at Michigan State University to build on this research, with scientists looking to gain deeper insights into schizophrenia through tests on the eye movement system. “That could eventually lead to a diagnostic tool, and also help us understand where the symptoms of schizophrenia come from,” says Professor Rolfs. Research into the eye movement system could prove to be an effective way of learning about schizophrenia, and Professor Rolfs plans to pursue further research in this area in the coming years. “Together with Professor Thakkar we’re investigating predictive processes relating to eye movements of schizophrenic patients, and this is something we need to understand better in the future,” he says.

This image shows a participant in an experiment wearing an EEG cap. The EEG cap picks up the electrical activity of the brain while participants perform psychophysical tasks (Credit: Julius Krumbiegel).

www.euresearcher.com

17


Stress and food: investigating the neural connections

The vast majority of people nowadays experience stress in very different ways to our ancestors, but we often still respond in a similar way, namely by eating food. Dr Frank Meye tells us about his research into links between certain changes in the brain and stress-eating, work which could lead to new insights into several eating disorders.

Many of us react to a stressful day by treating ourselves to some sugary, fatty food, which often helps us to relax and unwind. This behaviour has deep historical roots, as it makes perfect sense from an evolutionary point of view, helping us to deal with stress efficiently. “Say you’re stressed because you’ve just been chased by a tiger, and you’ve used a lot of energy. When you’ve dealt with the stress, it makes sense to recover energy and probably even to consume more energy than you would have otherwise,” explains Dr Frank Meye. Tigers are not a common hazard nowadays, yet many of us still respond to modern forms of stress in a superficially similar way, a topic central to Dr Meye’s research. “The focus in my project is on stress-eating as a symptom not only of obesity, but also several eating disorders, for example bulimia nervosa,” he outlines.

Stress eating

The wider goal in this research is to understand how stress-eating comes about, which holds importance for our understanding of not only obesity, but also several binge-eating disorders. Stress in this sense typically means the social and psychological stress that many of us experience on a daily basis, which can take several different forms. “Stress is not really a uniform concept – the kind of stress matters, and also the intensity of it. The effects can be different depending on if you have moderate or extremely high levels of stress,” explains Dr Meye. A little bit of stress can be beneficial for an individual in some circumstances, making them feel involved in a task and driving them to perform well, but too much stress will have the opposite effect, which Dr Meye says may affect an individual’s attitude to food. “If stress becomes too intense, then you may become depressed, to the point where you lose interest in food,” he points out.

This is not the main point of interest in the project, so researchers instead aim to mimic relatively moderate levels of stress in an animal model. In this model a mouse is paired with a larger, aggressive male. “The larger male asserts itself and shows who is the alpha male, and that leads to some fighting,” he says. The fighting is limited to a fairly short period, but for the rest of the time the mice are co-habiting, separated by a semi-permeable wall, so they are aware of each other’s presence. “There’s no actual physical stress at that point, nevertheless there is a looming stress,” continues Dr Meye. “We look at the response of the stressed mouse to this situation on the behavioural level in terms of food intake. We look at how much food the stressed mouse consumes, and we also give them a choice of food, to understand not only how much they eat and how many calories they consume, but also where they get their calories from.” The mouse is given a choice between bland but nutritious laboratory food, typically referred to as chow, and a more fatty, tasty option. The data, both in terms of the amount of food consumed and the type, can then be compared to that from non-stressed mice, and Dr Meye has gained some interesting results. “The stressed mice show a greater preference for the tasty foods, and their overall calorie intake is increased as a consequence of that,” he outlines. Researchers are aiming to understand the neural circuits that are involved in this kind of behaviour. “We’re mainly focused on electrophysiological recordings from neurons embedded in identified brain circuits that we think are involved in stress-eating,” says Dr Meye.

We’re mainly focused on electrophysiological recordings from neurons embedded in identified brain circuits that we think are involved in stress-eating.

EU Research


COMFORT FOR THE TROUBLED MIND
Unraveling the neural circuits of stress eating

Neurons as viewed under a microscope. The green fluorescence in two cells identifies them as dopamine releasing. A thin glass patch pipette, coming in from the right, is placed on the top neuron to record its activity.

Project Objectives

The goal of this study is to determine how stressors act on the function of specific neural circuits involving midbrain dopamine neurons, to enhance our proclivity for palatable food. To this end we use a mouse model of stress, and then use techniques that allow us to assess how such stress alters the function of midbrain dopamine neural circuitry, and what the consequence of this is for behaviors like intake of comfort food.

Project Funding

The funding for this particular project came from NWO (Veni Grant). https://www.nwo.nl/en/funding

Contact Details

A patch-clamp electrophysiology set-up allowing the recording of electrical activity of neurons identified as embedded in particular pathways in the brain.

A lot of attention here is focused on the dopamine system, which has been linked to reward processing. One hypothesis suggests that stress results in the dopamine system being primed in some way, so that both humans and animals are more inclined to seek out these highly rewarding forms of food. “We make recordings of neurons, to see if they are indeed in a state where they’re more prone to firing,” outlines Dr Meye. Researchers are also using optogenetics to highlight specific pathways in the brain, from which more can be learned about their role in stress-eating. “Optogenetics has allowed us to make sure that only specific groups of neurons within a given piece of tissue become sensitive to the light. So we can stimulate neural circuits with much higher precision than previously possible,” explains Dr Meye. “We’re trying to understand exactly which pathways to, say, the dopamine system, are sensitive to stress.”

Preventative approaches

This gives researchers a basis on which to investigate the changes in the brain that are involved in the stress-eating response in some way. From there, Dr Meye is also considering further steps, to actually prevent stress-eating from occurring. “We could take an approach where we try to silence a particular pathway to the dopamine neurons after stress, and see if that in itself is enough to prevent the stress-eating from occurring,” he explains. This would be a step towards proving a causal link between stress-induced changes in the brain and eating behaviour, which is a major priority in research. “That would be the goal – to show which changes are instrumental in driving that behaviour, and to identify strategies that can be employed with the mouse to prevent stress eating from occurring,” continues Dr Meye. The mapping of which neural circuit changes are responsible for stress-eating in mice may not directly lead to translational approaches in humans, yet Dr Meye believes it could help researchers identify the areas in which attention should be focused.

Over the short term, Dr Meye and his colleagues aim to build a clearer picture of which changes in the brain are associated with stress-eating. “We hope to have a thorough understanding of which parts of the brain that communicate with the dopamine system are actually altered by stress, and what the consequences of that are for stress-eating behaviour,” he says. Looking further ahead, Dr Meye has also gained funding from the ERC to look beyond the dopamine system. “This is kind of a more extended view of what really goes on during stress-eating,” he outlines. “With this ERC grant, we are also going to look at how cortical circuits may control the decision-making process in eating behaviour.”

Department of Translational Neuroscience
Brain Center Rudolf Magnus
Universiteitsweg 100; 3584 CG Utrecht
The Netherlands
Frank Meye
T: +31 88 756 1234
E: f.j.meye-2@umcutrecht.nl
W: www.meyelab.net
W: https://www.researchgate.net/profile/Frank_Meye
@fjmeye

Frank Meye

Frank Meye has always been fascinated by the ways in which the function of specific neural circuits gives rise to reward seeking, and the ways in which this can become aberrant. During his PhD work (BRCM, Utrecht, The Netherlands) he demonstrated how the midbrain dopamine system (linked to reward seeking) is controlled by specific receptors implicated in feeding behavior. Afterwards, he pursued postdoctoral training at INSERM in Paris, France. There he used state-of-the-art approaches to decipher the neural basis for the aversive cocaine withdrawal state that emerges after abated use, and which is a key contributor to drug relapse. Currently as a young group leader he endeavors to unravel how stressful experiences drive us towards high-caloric unhealthy food choices.

19


Self-healing materials are commonly found in nature, yet man-made materials do not typically have the ability to repair themselves after damage and regenerate their function. We spoke to Professor Wolfgang H. Binder about his research into self-healing and stress reporting, which could help to improve the reliability of a wide variety of materials.

Opening up new horizons in self-healing materials

The vast majority of engineering materials have traditionally been designed with an emphasis on damage prevention rather than damage management, but now the emergence of self-healing materials promises to open up new horizons. Based at the University of Halle in Germany, Professor Wolfgang H. Binder and his research group are investigating self-healing principles, which could be used to improve existing materials. “We are developing concepts to introduce self-healing into these materials,” he outlines. A second aim is to introduce stress reporting capabilities into these materials. “When a material is mechanically deformed or damaged, we want to restore its initial properties. But we also want to see where the damage occurred.” This is an important issue for the aerospace industry in particular, where safety depends to a large degree on rapidly identifying which specific areas on a structure have been subject to high levels of stress. Professor Binder and his colleagues use mechanochemistry to induce a chemical reaction from mechanical stress. “We transform the energy of the stress into a chemical reaction,” he explains. On the one hand this shows where stress has occurred, while it can also help to start the process of healing any damage. “Stress reporting is very important in certain application areas, such as automotive and aerospace engineering. We want to know where stress has occurred, so that we can essentially visualise it with relative ease.”

The question is how you can introduce a material with which I can achieve self-healing in a living organism. A second point surrounds how self-healing materials can be more effectively implemented in a structure...

Self-healing

Many biological materials have self-healing properties, so Professor Binder and his colleagues are drawing a lot of inspiration from the natural world in research. For example, one healing principle that researchers are using is hydrogen bonds, versatile structures which are very common in nature, especially in bio-materials. “When we look at biomaterials – for instance spider silk – we see that the versatility of hydrogen bonds is very important.” Researchers are taking principles from nature and transforming them for technical purposes, although Professor Binder says that they may work differently in synthetic materials. “We aim to understand how a hydrogen bond works in a synthetic polymer material.” A hydrogen bond is non-covalent, so it’s relatively weak in comparison to a covalent bond. The bonding energy of a hydrogen bond is also much lower than for a covalent chemical bond, which is an important consideration in terms of the project’s wider goals. “This means that at normal temperature, this bond can open and close all the time, without us even realising it has happened.” This opening and closing of the bond could be used to induce self-healing in certain classes of materials. “When you cut a material and put the two edges together, the hydrogen bonds find each other on these two different parts of the material, and they reorganise and restructure.”

The second self-healing mechanism of interest in the project is based on the use of capsules. In this case, capsules containing an epoxy resin are introduced into a material, which are then used to heal damage when it occurs. “If a material breaks then the capsules are ruptured, and the contents are delivered to the crack site.” Once the resin has been delivered to this location, Professor Binder says that a chemical reaction is then initiated to effectively glue a crack together. “We are using the click reaction, a very efficient and easy reaction, which works at room temperature,” he explains. “We use graphene, together with copper nanoparticles, as a catalyst. We have developed this in the programme, and we can run the chemistry at room temperature and even below.” This is an important consideration, as while some groups use heat to induce materials to self-heal, the emphasis in Professor Binder’s group is on achieving it at room temperature. The idea is to develop a new concept and incorporate it into existing materials. “The materials we are adding are new, but they are only present in relatively small quantities in the main structure,” he stresses. This will help minimise disruption to industry, which could encourage more companies to consider these new concepts. “For industry, entirely changing a portfolio of materials is often simply too expensive and too risky,” points out Professor Binder. “We are taking existing materials, and adding self-healing and stress reporting properties.”

A number of tests have also been developed to assess the effectiveness of these self-healing concepts in terms of regenerating the function of materials. One very simple test involves first cutting a material, then putting the two parts together and allowing it to heal, before making some stress-strain measurements. “So you stress the material and check how strong it is, then you can measure the recovery in percentage terms,” he explains. With a second set of tests the material is stressed, then its colour is monitored by fluorescence spectroscopy. “You check how much chemistry has been produced by this mechanical stress which you have put on the material, then you measure it. So you can quantify the stress.”

Fig. 1: Repeated self-healing with H-bonds: reversible and multiple self-healing1-7. Copyright Wiley-VCH Verlag GmbH & Co. KGaA. Reproduced with permission.
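For readers unfamiliar with how such recovery is typically reported, a common convention in the self-healing literature (given here only as an illustrative definition, since the group may use other reference properties such as toughness or strain at break) is to quote a healing efficiency from the stress-strain test:

\eta_{\text{healing}} = \frac{\sigma_{\text{healed}}}{\sigma_{\text{pristine}}} \times 100\%

where \sigma_{\text{healed}} is the strength measured after cutting and healing, and \sigma_{\text{pristine}} is the strength of the undamaged material.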

The wider context here is the issue of the reliability and performance of engineering materials, which has important financial and environmental implications. While a micro-crack on a material may be only nanometres in length, a scale not even visible to the naked eye, it can develop further and lead to significant problems if left unchecked. “With these micro-cracks, you want to take action before they develop into macro-cracks.” Stress reporting therefore needs to be continuous, rather than just in response to specific events. “The system can report the repeated stress on materials. With it, we can understand the history of stress in a material and assess the level of stress it has been subjected to.”

Industrial applications

This research has attracted the attention of industry, including automotive and aerospace companies, who are keen to both improve the reliability of materials and reduce maintenance costs. The idea is often not to use self-healing materials throughout a whole structure, but rather in specific parts which may be more vulnerable to damage or scratches, such as the steering wheel of a car. “A German automotive manufacturer is using a coating on the steering wheel, so that when it is scratched, the scratch is removed by itself after a certain period of time,” outlines Professor Binder. There has also been contact with representatives of the aircraft industry, particularly with respect to optical fibres. “These optical fibres are used to transmit information in an aircraft.” A large volume of information is transmitted between different parts of an aircraft during flight, so it’s essential that these optical fibres function effectively. The optical fibres are bound together with certain parts of the aircraft during the manufacturing process, which Professor Binder says requires careful monitoring. “We want to see whether any of these fibres have been broken during the manufacturing process. For this we need a reporting system which is embedded into the fibres, in order to visualise where any damage has happened,” he explains. This can act as a kind of initial quality check, ensuring that optical fibres are still functioning effectively. “Stress reporting is very important here.” The group’s research could also be relevant to other areas of industry, such as the chemical sector, while there are still many other avenues of research to explore. One major area of interest to Professor Binder is whether self-healing can be incorporated into a living system.

Fig. 2: Mechanochemical healing: force-induced chain breakage and catalyst activation, allowing stress detection and stress-induced repair8-12. The concept can be extended to self-healing epoxy-resin systems13,14, using encapsulated reagents15. Copyright Wiley-VCH Verlag GmbH & Co. KGaA. Reproduced with permission.

www.euresearcher.com

21


MECHANOCHEMICAL AND SUPRAMOLECULAR SELF-HEALING POLYMERS

Fig. 3: Graphene based self-healing allowing stress-location by fluorogenic “click”-reactions9,12,16. Reproduced with permission from the Royal Society of Chemistry, (RSC).

Project Objectives

One of the most outstanding properties of biological materials is their ability to self-heal and regenerate function upon the infliction of damage by external mechanical loads. Man-made materials generally do not have this healing ability, as engineering materials were and are developed on the basis of the ‘damage prevention’ paradigm rather than a ‘damage management’ concept. However, self-healing materials certainly offer enormous possibilities, in particular for applications where long-term reliability in poorly accessible areas, such as tunnels, underground infrastructures, high-rise buildings or space applications, is important. The objective of the Priority Programme is the conceptual design of synthetic self-healing materials and the elucidation of generic, fundamental material-independent principles (e.g. following a sequence of crack generation and propagation, mobility and transport of material, interface bonding and immobilisation of the transported material).

Project Funding

Funding by the European Union’s Seventh Framework Programme for research, technological development and demonstration under Grant Agreement No. 313978 (IASS, “Improving aircraft safety by self healing structures and protecting nanofillers”), as well as Grant Nos. DFG-Bi 1337/8-1 and DFG-Bi 1337/8-2 within the SPP 1568 (“Design and Generic Principles of Self-Healing Materials”) by the Deutsche Forschungsgemeinschaft (DFG).

“The question is how you can introduce a material with which I can achieve self-healing in a living organism,” he outlines. A second point surrounds how self-healing materials can be more effectively implemented in a structure. “This is quite a large step, and I think that 3-D printing is a key technology here,” says Professor Binder. “With 3-D printing technology you can basically introduce a material in any position within your structure, technologically or optically.”

This opens up the possibility that a wider range of materials could have self-healing properties in future. While economic realities mean that the concept is unlikely to be introduced into all materials, he believes that self-healing could bring significant benefits in certain areas. “It makes sense if you have a self-healing tyre for example. A tyre can be used for a longer period, and in this way you can really enhance sustainability,” he points out.

Fig. 4 : Click-chemistry based self-healing nanocomposites15. © Wiley-VCH Verlag GmbH & Co. KGaA. Reproduced with permission.

Contact Details

Professor Wolfgang H. Binder
Martin Luther University Halle-Wittenberg
Full Professor of Macromolecular Chemistry
Faculty of Natural Sciences II
von Danckelmannplatz 4, D-06120 Halle
T: +49 (0)345 55 25930
E: wolfgang.binder@chemie.uni-halle.de
W: http://www.natfak2.uni-halle.de/forschung/polymers/institute_of_chemistry/binder/?lang=en

Professor Wolfgang Binder

(1) Chen, S.; Yan, T.; Fischer, M.; Mordvinkin, A.; Saalwächter, K.; Thurn-Albrecht, T.; Binder, W. H. Opposing Phase-Segregation and Hydrogen-Bonding Forces in Supramolecular Polymers. Angewandte Chemie International Edition 2017, 56, 13016-13020. (2) Chen, S.; Binder, W. H. Dynamic Ordering and Phase Segregation in Hydrogen-Bonded Polymers. Acc. Chem. Res. 2016, 49, 1409-1420. (3) Chen, S.; Mahmood, N.; Beiner, M.; Binder, W. H. Self-Healing Materials from V- and H-Shaped Supramolecular Architectures. Angew. Chem., Int. Ed. 2015, 54, 10188-10192.

Wolfgang Binder is Professor of Macromolecular Chemistry at Martin Luther University Halle-Wittenberg. His research centres around the preparation of functional polymers and the transfer of the generated molecules into areas of biomimetic polymers, self-healing polymers and nanostructured materials.

(4) Yan, T.; Schröter, K.; Herbst, F.; Binder, W. H.; Thurn-Albrecht, T. What Controls the Structure and the Linear and Nonlinear Rheological Properties of Dense, Dynamic Supramolecular Polymer Networks? Macromolecules 2017, 50, 2973-2985. (5) Yan, T.; Schröter, K.; Herbst, F.; Binder, W. H.; Thurn-Albrecht, T. Unveiling the molecular mechanism of self-healing in a telechelic, supramolecular polymer network. Scientific Reports 2016, 6, 32356. (6) Herbst, F.; Binder, W. H.: Self-healing polymers via supramolecular, hydrogen bonded networks. In Self Healing Polymers: from Principles to Application; Binder, W. H., Ed.; Wiley-VCH Verlag GmbH & Co. KGaA: Weinheim, 2013; pp 275-300. (7) Herbst, F.; Seiffert, S.; Binder, W. H. Dynamic supramolecular poly(isobutylene)s for self-healing materials. Polym. Chem. 2012, 3, 3084-3092. (8) Michael, P.; Biewend, M.; Binder, W. H. Mechanochemical Activation of Fluorogenic CuAAC “Click” Reactions for Stress-Sensing Applications. Macromolecular Rapid Communications 2018, 0, 1800376. (9) Döhler, D.; Michael, P.; Binder, W. H. CuAAC-Based Click Chemistry in Self-Healing Polymers. Accounts of Chemical Research 2017, 50, 2610-2620. (10) Michael, P.; Binder, W. H. A Mechanochemically Triggered “Click” Catalyst. Angew. Chem., Int. Ed. 2015, 54, 13918-13922. (11) Hu, M.; Peil, S.; Xing, Y.; Döhler, D.; Caire da Silva, L.; Binder, W. H.; Kappl, M.; Bannwarth, M. B. Monitoring crack appearance and healing in coatings with damage self-reporting nanocapsules. Materials Horizons 2018, 5, 51-58. (12) Döhler, D.; Rana, S.; Rupp, H.; Bergmann, H.; Behzadi, S.; Crespy, D.; Binder, W. H. Qualitative sensing of mechanical damage by a fluorogenic “click” reaction. Chem. Commun. 2016, 52, 11076-11079. (13) Guadagno, L.; Vertuccio, L.; Naddeo, C.; Calabrese, E.; Barra, G.; Raimondo, M.; Sorrentino, A.; Binder, W. H.; Michael, P.; Rana, S. Self-healing epoxy nanocomposites via reversible hydrogen bonding. Composites Part B: Engineering 2019, 157, 1-13. (14) Raimondo, M.; Nicola, F. D.; Volponi, R.; Binder, W.; Michael, P.; Russo, S.; Guadagno, L. Self-repairing CFRPs targeted towards structural aerospace applications. International Journal of Structural Integrity 2016, 7, 656-670. (15) Rana, S.; Döhler, D.; Nia, A. S.; Nasir, M.; Beiner, M.; Binder, W. H. “Click”-Triggered Self-Healing Graphene Nanocomposites. Macromol. Rapid Commun. 2016, 37, 1715-1722. (16) Shaygan Nia, A.; Rana, S.; Döhler, D.; Osim, W.; Binder, W. H. Polymer 2015, 79, 21-28.

22

EU Research


A deeper picture of the nitrogen cycle

Nitrogen is indispensable to life on earth, yet its reactive forms can also have harmful effects on the environment when present in excessive amounts. We spoke to Professor Cláudia Marques-dos-Santos Cordovil about the work of the NitroPortugal project in strengthening the country’s research base and laying the foundations for continued investigation into the nitrogen cycle.

The nitrogen cycle has changed dramatically over recent history, which has been linked to wider environmental problems, such as increased water and air pollution and loss of biodiversity. This has prompted a renewed focus on research; the European Nitrogen Assessment (ENA) was established in 2011 to look at the major issues surrounding nitrogen losses, out of which some key concerns were identified. “The ENA came up with five major concerns, described as WAGES. These are water, air, greenhouse gases, ecosystems and biodiversity, and soils. This is where we have to focus our efforts to reduce nitrogen losses,” says Cláudia Cordovil, a Professor in the School of Agronomy of the University of Lisbon. This is an increasingly urgent priority, with growing concern around the impact of changes to the nitrogen cycle. Nitrogen forms several kinds of reactive compounds which can be transformed relatively easily and trickle into different parts of the environment, in what has been described as the nitrogen cascade. “This is why it’s so difficult to control nitrogen use, and why it’s so difficult to improve nitrogen use efficiency,” explains Professor Cordovil. Climatic and soil conditions have a major influence on the extent of nitrogen losses, so Professor Cordovil believes it’s important to develop a deeper picture of how the nitrogen cycle is changing in Portugal. “The situation in southern Europe is not the same as in other parts of Europe,” she points out.

NitroPortugal project

A lot of the measurements on which wider European legislation around nitrogen management and emissions are based have historically been made in central and northern Europe, despite these regional differences. Now Professor Cordovil and her colleagues in the NitroPortugal project are looking at the situation in Portugal in greater depth, while also aiming to strengthen the country’s research base in this area. “We’re looking at it from a holistic point of view,” she explains. “We are looking at these different parts of the environment described in WAGES, and investigating how excessive reactive nitrogen in the environment can affect them.” This is a relatively neglected area of research, believes Professor Cordovil, particularly in view of its environmental importance. The carbon cycle has attracted a lot of attention in research as awareness of climate change has grown over recent decades, yet nitrogen has not reached the same level of prominence. “I think that nitrogen has not been given the same level of attention as carbon has,” says Professor Cordovil. Nitrous oxide (N2O) is in fact a much stronger greenhouse gas than CO2, yet the latter figures much higher in the public consciousness; this is an imbalance Professor Cordovil and her colleagues are working to address. “We aim to heighten awareness about the need to improve nitrogen use efficiency,” she explains. The foundation of this work is a deep knowledge of nitrogen and how to increase nitrogen use efficiency to reduce current reactive nitrogen losses in forms that find their way into different parts of the environment.

People normally work on very specific areas. For example, soil scientists work on soil samples, biodiversity scientists work mostly with their populations, and people working on water issues are concerned about the level of nitrate contamination, to give a simple example.

While nitrogen is essential to life, excessive amounts of the reactive forms in the environment can have a negative impact. “On the one hand, nitrogen is absolutely indispensable to life. It is 78 percent of the air that we breathe and it plays a central part in processes like DNA formation. But on the other hand if there are excessive amounts of the reactive forms in the environment, then that can cause problems,” explains Professor Cordovil. A prime example is the impact of excess reactive nitrogen on air pollution. Evidence suggests that reactive nitrogen is linked to the formation of particulate matter, fine particles in the atmosphere that can affect the heart and lungs, increasing the occurrence of respiratory diseases. “This has a serious impact on public health,” points out Professor Cordovil. Over recent years, the amount of reactive nitrogen in the environment has increased across the world, underlining the importance of the project’s work. “Our idea is to continue the work that was done previously in Nitrogen Europe and other initiatives,” says Professor Cordovil. “We want to strengthen collaboration between institutions, and to increase Portuguese participation in European nitrogen research.”

There is also a strong emphasis in the project on sharing expertise and knowledge across different disciplines, which has not always been the case previously, with researchers often working in highly specialised areas. Nitrogen is of interest across many different disciplines, yet researchers do not typically work together and share data to build a deeper picture. “People normally work on very specific areas. For example, soil scientists work on soil samples, biodiversity scientists work mostly with their populations, and people working on water issues are concerned about the level of nitrate contamination, to give a simple example,” explains Professor Cordovil. This is an issue that the project aims to address by strengthening collaboration, identifying gaps in knowledge, and encouraging scientists to share their findings. While there are some research activities in the project, the main focus is on strengthening Portugal’s research capacities in this area and heightening awareness of the issues around nitrogen losses more generally. “We’re trying to reach the general public, and we have also attracted a lot of interest from the farming sector,” says Professor Cordovil. A lot of attention in the project is focused on the agricultural sector in particular. “The agricultural sector is one of the major contributors to nitrogen losses due to the use of mineral fertilisers and manures for example,” outlines Professor Cordovil.

The four Principal Investigators of the NitroPortugal project.

EU Research


NITROPORTUGAL Strengthening Portuguese research and innovation capacities in the field of excess reactive nitrogen

Project Objectives

To understand the state of knowledge and environmental status regarding nitrogen in Portugal, and to bring Portugal to the forefront of the international nitrogen research, dissemination and policy arena.

Project Funding

NitroPortugal is an H2020-TWINN-2015 Coordination & Support Action (grant agreement no. 692331).

Project Partners

The NitroPortugal Team

• Instituto Superior de Agronomia (School of Agronomy), University of Lisbon, Portugal, PI Cláudia M. d. S. Cordovil
• Faculty of Sciences of the University of Lisbon, Portugal, PI Cristina Branquinho
• Centre of Ecology and Hydrology, Natural Environment Research Council, UK, PI Mark Sutton
• Aarhus University, Denmark, PI Tommy Dalgaard

Contact Details

Project Coordinator, Professor Cláudia Marques-dos-Santos Cordovil (PhD)
Instituto Superior de Agronomia da Universidade de Lisboa
Tapada da Ajuda, 1349-017 Lisboa, Portugal
T: +351 21 365 3424
E: cms@isa.ulisboa.pt
W: http://www.isa.ulisboa.pt/proj/nitroportugal/

Cláudia Marques-dos-Santos Cordovil

Cláudia Marques-dos-Santos Cordovil is Professor at the University of Lisbon, School of Agronomy. She works on nitrogen, soil fertility and plant nutrition, and the recycling of organic wastes for nutrient recovery, and collaborates with international partners in science communication on nitrogen issues. She is co-chair of the Task Force on Reactive Nitrogen of the UNECE.

Agricultural sector

These types of practices have become common in farming, yet there are more sustainable alternatives, and Professor Cordovil says that part of the project’s agenda involves educating farmers around the importance of effective nitrogen management. “Farmers need to understand the benefits of doing proper fertiliser management and proper nitrogen management,” she outlines. There has already been a shift in this general direction over the last few years, with the re-emergence of some traditional farming practices. “Some of the old practices that our grandparents used to do are gaining importance again in agriculture, for example crop associations are being looked at carefully again,” continues Professor Cordovil. A number of other techniques are also available to help reduce nitrogen losses, such as urea coatings and nitrification inhibitors, yet there is still scope for further improvement. There is a growing international awareness of the impact of nitrogen losses, and Professor Cordovil says some countries are taking action. “For example India, which is a major producer of urea fertiliser, is looking towards producing fertilisers using only new neem coated urea, which helps to reduce nitrogen losses,” she says.

By strengthening the Portuguese research base, Professor Cordovil aims to help keep the nitrogen cycle in the forefront of public attention. “It is about raising awareness of nitrogen issues, both in policy makers and in the general public,” she says. The public have an important role to play in these terms by changing behaviour and encouraging companies to move towards more sustainable practices, which starts with education. A higher level of public awareness is also central to continued research in this area in future, both in Portugal and elsewhere. “The example of NitroPortugal could be followed by countries which perhaps don’t have a very strong tradition of research in this area,” outlines Professor Cordovil. This will help to lay the foundations for further research and international collaborations. “We are starting to work on nitrogen assessments with Spanish colleagues. We hope to prepare the ground for a new approach in this area,” says Professor Cordovil.

We came up with five major concerns, which are water, air, greenhouse gases, ecosystems and biodiversity, and soils - WAGES. This is where we have to focus our efforts to reduce nitrogen losses and to improve nitrogen use efficiency.

25




Balancing risk and reward in the financial system

Over ten years on from the financial crisis, debate continues about how the financial industry can be regulated effectively, balancing the goal of ensuring banks can make profits while also maintaining the stability of the financial system. We spoke to Professor Gerhard Illing about his research into the relationship between monetary policy and banking behaviour.

Many central banks across the world responded to the financial crisis of 2008 by injecting liquidity into the market to prevent insolvency arising from illiquidity. While this has helped to maintain market stability, it may also be encouraging banks to continue operating much as they did before the crisis. “Banks may anticipate that in a systemic crisis, the central bank will be willing to provide lender of last resort activities, aiming to dampen the impact of systemic risk. This gives private banks an incentive to engage in excessive risk-taking,” explains Gerhard Illing, Professor of Economics at the University of Munich. This topic is at the heart of Professor Illing’s project on illiquidity, insolvency and banking regulation, in which researchers are investigating the feedback between monetary policy and risk-taking in the financial industry. “We are looking at how the actions of central banks affect risk taking. Over recent years central banks have supported financial institutions by providing liquidity and lowering interest rates, even below zero,” he outlines.

Illiquidity and insolvency

This has encouraged banks to increase their leverage and invest in illiquid projects, which may leave them vulnerable should a financial crisis occur. In the years leading up to 2008 many banks increased their financial exposure to assets which they viewed at the time as highly liquid, being traded on the interbank market. “But when the crisis hit, the liquidity of these assets suddenly dried out, and their value dropped dramatically,” says Professor Illing. Regulators have responded by imposing strict liquidity requirements in the Basel III framework, aiming to prevent the accumulation of excessive risk. “Our research provides a theoretical justification for Basel III rules imposing liquidity constraints and regulations, which incentivize banks to invest more in liquid assets,” continues Professor Illing. “The value of these assets is more stable than illiquid assets, so their value does not fall dramatically during a crisis.”

In a systemic crisis, the central bank will provide lender of last resort activities, helping to dampen the impact of systemic risk. The problem is that banks will anticipate this reaction, giving them strong incentives to engage in excessive risk-taking. Ex ante liquidity regulation is needed to counter that incentive.

The 2008 crisis exposed the overdependence of financial institutions on short-term funding, driven to a large extent by the abundant availability of public liquidity. A prime example is British company Northern Rock; although a mortgage provider, it had hardly any deposits in the traditional sense at the time the financial crisis hit. “Northern Rock mainly got their liquidity either from other banks, or from money market funds,” explains Professor Illing. This worked for a period, but the onset of the financial crisis revealed inherent problems, eventually leading to a run on the bank. “As distrust among banks grew, the inter-bank lending market essentially dried up and Libor (the London Inter-Bank Offered Rate) shot up to unprecedented levels,” says Professor Illing. “Northern Rock were cut off from lending, because they relied too much on short-term liquidity in the inter-bank market.” This led to a dramatic deterioration in their position, and after providing tens of billions of pounds in liquidity support, the government eventually had to nationalise the bank. This was an expensive process, but helped to prevent a systemic meltdown, which Professor Illing says would have had serious consequences. “Insolvency, on the aggregate level, is highly costly. So for that reason there is a strong motivation for central banks to provide liquidity support,” he outlines. The challenge for regulators is in both anticipating when a financial crisis may occur, and also taking measures to effectively reduce the impact of a crisis if it does. “If you take measures during a crisis to mitigate the effects, then this effectively encourages bankers to be less careful,” points out Professor Illing. “So you need to try and mitigate the build-up of a crisis ex ante – and then during a crisis, also take measures to dampen its impact.”

Regulatory framework

The aim for regulators now is to learn from the mistakes of the past and develop a more effective regulatory framework, one that gives banks the commercial freedom they need to stay profitable while also discouraging excessive risk-taking. The project will make an important contribution in this respect, with Professor Illing and his colleagues contributing to the design of a new framework for macro-prudential regulation. “It allows us to model the behaviour of banks subject to regulatory constraints, and to model the response of banks to regulatory constraints,” he outlines. The behaviour of banks here is modelled endogenously, as it is subject to change as market circumstances evolve. “We evaluate non-linear dynamics, as banks endogenously change their asset structure and their dividend payment process during a crisis,” says Professor Illing.
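The broad mechanism at stake here, an anticipated lender of last resort encouraging illiquid balance sheets and ex ante liquidity rules countering that incentive, can be illustrated with a deliberately stylised toy calculation. The Python sketch below is purely illustrative and is not the project’s model; all numbers and functional forms are invented for the example.

def expected_profit(x, crisis_prob=0.1, r_liquid=0.01, r_illiquid=0.06,
                    fire_sale_loss=0.4, bailout=False):
    """Expected return when a share x of assets is illiquid (toy example).

    Illiquid assets yield more, but in a crisis they must be sold at a
    discount that worsens with the size of the position, unless the bank
    expects the central bank to provide liquidity support.
    """
    base = (1 - x) * r_liquid + x * r_illiquid
    crisis_loss = 0.0 if bailout else fire_sale_loss * x ** 2
    return base - crisis_prob * crisis_loss


def best_share(liquidity_floor=0.0, bailout=False, steps=1000):
    """Profit-maximising illiquid share, given a minimum liquid-asset ratio."""
    grid = [i / steps for i in range(steps + 1) if 1 - i / steps >= liquidity_floor]
    return max(grid, key=lambda x: expected_profit(x, bailout=bailout))


print("no backstop expected:     ", best_share())                                    # 0.625
print("backstop expected:        ", best_share(bailout=True))                        # 1.0
print("backstop + liquidity rule:", best_share(bailout=True, liquidity_floor=0.5))   # 0.5

Under these invented parameters the bank picks an interior illiquid share when it bears the fire-sale loss itself, goes fully illiquid when it expects a backstop, and is pulled back only by a binding liquidity requirement, which is the qualitative pattern described earlier in the article.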


This provides foundations to analyse the likely impact of different regimes, such as changes in capital or liquidity regulations, which holds important implications for financial policy. Changes to the Basel III regulations have recently been proposed, which Professor Illing and his colleagues plan to evaluate during the project, part of the wider goal of improving financial regulation. “We aim to provide more evidence-based insights, for example in stress testing. This is a very challenging topic mathematically,” he continues. “Until now, the focus has been on micro-prudential stress-testing, rather than macro-prudential stress testing. That’s something we are interested in looking into in future.”

The nature and extent of the financial regulation regime is a matter of ongoing debate. “During good times, it’s hard for policy-makers to resist the pressure to relax regulation, because everything seems to be running smoothly, and there is strong pressure to argue that regulation hinders growth,” explains Professor Illing. His research indicates that credit booms triggered by financial liberalization may actually push the economy into persistent stagnation, seducing borrowers into unsustainable consumption spending and driving them into a debt overhang.

ILLIQUIDITY, INSOLVENCY, AND BANK REGULATION
Illiquidity, Insolvency, and Bank Regulation

Project Objectives

The provision of central bank liquidity as lender of last resort is seen to have contributed to incentives for excessive risk-taking in the financial industry. Researchers in the project analyse the feedback between monetary policy and risk-taking of financial intermediaries, aiming to contribute to the design of a new framework for macro-prudential regulation. In the project, researchers model the impact of liquidity provision by central banks on incentives of financial intermediaries to engage in activities creating systemic risk.

Project Funding

This project is funded by the German Research Foundation (DFG).

Contact Details

Professor Gerhard Illing
Ludwig-Maximilians-Universität München
Seminar für Makroökonomie
Ludwigstr. 28 / 012 (Rgb.)
80539 München
T: +49 (0) 89 / 2180 - 2126
E: illing@econ.lmu.de
W: https://www.sfm.econ.uni-muenchen.de/personen/professor/illing/index.html

Priority program

The project itself is part of a larger priority program “Financial Market Imperfections and Macroeconomic Performance” funded by the German Research Foundation (DFG), focusing on the link between macroeconomics and financial economics. The program is comprised of over 20 projects, analyzing the effect of financial market imperfections on financial market stability, macroeconomic volatility, and long-run economic growth. “Various international conferences have been held as part of the programme,” says Professor Illing. This has encouraged the exchange of ideas and led to new collaborations, which Professor Illing hopes will stimulate further research in the years ahead.

Professor Gerhard Illing

Gerhard Illing is a Professor of Economics at Ludwig-Maximilian University of Munich. His main research areas are monetary theory, financial stability, systemic risk and lender of last resort policy. He is a CESifo Research Fellow and a member of the German Economic Association, active in the areas of Monetary Economics and Macroeconomics.



How to respond to a financial crisis

A number of different fiscal and monetary policy instruments were adopted in the aftermath of the 2008 financial crisis, as governments and central banks aimed to keep the cogs of the economy moving. We spoke to Professor Andreas Schabert about his work in assessing the effectiveness of these policy instruments.

The financial crisis of 2008 posed an enormous challenge to governments and central banks across the world as several major financial institutions teetered on the edge of collapse, threatening to undermine the foundations of the global economy. Various monetary and fiscal policy instruments are available to help maintain financial stability in these kinds of circumstances, a topic central to Professor Andreas Schabert’s research. “The main question in my project is: what monetary and fiscal policy instruments are particularly useful in times of crisis?” he outlines. Some of the instruments applied during and after the 2008 crisis had not been used before, so Professor Schabert believes it’s important to build a stronger evidence base. “For example, in 2008-9 the US Federal Reserve bought mortgage-backed securities (MBS) in large volumes,” he says.

MONETARY AND FISCAL POLICY IN TIMES OF CRISIS

This project examines the effectiveness of unconventional and conventional monetary and fiscal policy measures that were conducted during the recent financial crisis. Examples are asset purchases and forward guidance conducted by the central bank or government spending programs. The project further examines financial regulation and the interaction with monetary policy.

Professor Andreas Schabert
University of Cologne
Center for Macroeconomic Research
Albertus-Magnus-Platz
50923 Cologne, Germany
T: +49 172 2674482
E: schabert@wiso.uni-koeln.de
W: https://cmr.uni-koeln.de/de/team/senior-faculty/schabert/

Andreas Schabert is a Professor of Economics at the Center for Macroeconomic Research, part of the University of Cologne. His main research interests are monetary policy, fiscal policy, financial markets and international macroeconomics. He was a Duisenberg Fellow at the European Central Bank in 2014 and regularly contributes papers.


Monetary policy

This was in response to concerns about the falling market price of MBS, a decline attributable to problems in the US mortgage debt market. This in turn led to problems for lenders, who re-financed by issuing these MBS. “When the price of these MBS slumped, re-financing of mortgage lenders became more and more critical,” explains Professor Schabert. The US Federal Reserve eventually stepped in and bought these MBS, and thereby stabilised the price. “In principle, this was a rescue instrument. The intention was to make sure that lenders could re-finance in a regular way,” continues Professor Schabert. “This was an exceptional measure, it had never been done before in the US and it has not been necessary since 2008, when prices stabilized.”

The role of a central bank is traditionally thought to be supplying central bank money or changing money market interest rates, which have only indirect economic effects. When a central bank considers using a particular tool, such as an asset purchase programme, Professor Schabert says it’s important to consider how it will affect asset prices. “Are these types of interventions really helpful? Do they really support a market which is not functioning? Or are they doing even more harm?” he asks. Many economists think, for example, that the ECB’s interventions in European markets for government debt have been very helpful. “The ECB was successful in stabilising interest rates in accordance with its mandate, in particular for those countries which had before seen a very high interest rate premium on their debt,” explains Professor Schabert.

Chart: development of central bank balance sheets over the period between January 2007 and approximately January 2014.

Fiscal policy

A variety of fiscal policy options are also available in times of crisis. One option is to increase spending to spur demand in the markets, so that overall production and income in the economy is stimulated. “That’s a traditional view on government spending,” says Professor Schabert. In the years following the financial crisis, many governments put in place huge spending programmes; the impact of this, considering the economic context, is a matter of debate. “Many economists have argued that these spending programmes are over-proportionally successful and helpful when short-term rates are at the zero lower bound (ZLB),” explains Professor Schabert. “In this context, the impact of monetary policy on the effectiveness of fiscal policy is overestimated, since longer-term rather than short-term rates are relevant for consumption and saving decisions.”
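The distinction between short-term and longer-term rates can be made concrete with the textbook expectations-hypothesis decomposition, which is a standard relation rather than a result of this particular project: an n-period interest rate is approximately the average of the short rates expected over its lifetime plus a term premium,

i_t^{(n)} \approx \frac{1}{n}\sum_{k=0}^{n-1} \mathbb{E}_t\!\left[ i_{t+k}^{(1)} \right] + \mathrm{TP}_t^{(n)} .

Even when the current short rate i_t^{(1)} is stuck at zero, expected future short rates and the term premium can still move, so the longer-term rates that matter for consumption and saving decisions are not pinned down by the zero lower bound alone.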

Further research in this area could underpin more effective financial regulation in future, while Professor Schabert hopes that the project’s work will lead to a wider acceptance of the need for interventions in financial markets. Over recent years new monetary and regulatory instruments have emerged, while the idea of macro-prudential regulation has also gained prominence. “We have to account for the economy-wide effects of regulation and the coordination of regulatory measures with monetary policy instruments. In the end, these policy tools jointly alter the conditions in various financial market segments,” says Professor Schabert.



Unravelling the Earth’s history beneath the ocean floor

The history of the Earth can be found beneath the floor of our oceans, where rocks and sediments act as a natural library of past events. ECORD supports ocean drilling, recovering these precious archives, which give scientists a window into the past, discerning patterns and reading signals that will give us advance warning of future changes and emerging societal challenges, as Dr Gilbert Camoin explains.

The rocks and

sediments beneath the ocean floor represent a vast archive of information about the history and evolution of the Earth, yet retrieving samples is by nature extremely challenging. The European Consortium for Ocean Research Drilling (ECORD) brings together 15 nations and is one of three platform providers, together with the USA and Japan, which give researchers the opportunity to participate in expeditions as part of the International Ocean Discovery Program (IODP). ECORD plays an important role in this respect, supporting drilling expeditions to different parts of the world. “The three platform providers are independent, but they work together to achieve the scientific objectives of the programme,” explains Dr Gilbert Camoin, the European consortium’s director. The IODP platform providers run expeditions to recover sediments and rocks from the seafloor and below, as well as to collect subseafloor fluids, microbes, and geophysical and geochemical data by instrumenting boreholes. Networks of

boreholes can be used for active experiments to resolve important properties and processes. These expeditions are driven by science, for the benefit of science and society globally, addressing a wide range of research areas encompassing fundamental issues affecting the planet: climate change, biodiversity and geohazards, including volcanic eruptions, earthquakes and tsunamis. “Ocean drilling is important to each of these different topics, as well as many others,” stresses Dr Camoin. Each of the platform providers offers facilities for ocean drilling, yet Dr Camoin says that ECORD is distinct from the American and Japanese programmes. “The American and Japanese platform providers have fixed facilities, ships with drilling and logging equipment. But we are ready to hire the right drilling or coring system for a particular scientific proposal, on an expedition-by-expedition basis,” he continues. “ECORD conducts mission-specific platform expeditions across the world. We can reach previously inaccessible areas and go where no scientific drilling project has gone before.”

ECORD expeditions

The first ECORD expedition, in 2004, travelled to the Arctic basin and gathered material from the Lomonosov Ridge, an underwater mountain chain located around 1,000 metres below the sea surface. A number of further expeditions have since been conducted, in locations as diverse as Tahiti, the Gulf of Corinth, the Chicxulub Impact Crater offshore Mexico, and the Great Barrier Reef. The scientific objectives of these different expeditions have varied, with some requiring shallow-water drilling at depths of 20 to 60 metres, while Dr Camoin says others have gathered material from much deeper. “The record in the IODP programme in terms of water depth is in Japan, where we drilled at a depth of more than 8,000 metres,” he says. Drilling at these kinds of depths is extremely challenging, yet modern tools and infrastructure are up to the task. “We have diversified the drilling capabilities, and we are able to work more or less everywhere with ECORD,” continues Dr Camoin.

Image Right: Working deck of the Fugro Synergy drilling vessel (© D. Smith, ECORD-IODP). Image Below: Fugro Synergy drilling vessel at the start of IODP Expedition 381 ‘Corinth Active Rift Development’ operated by ECORD in October-December 2018 (© R. Gawthorpe, ECORD-IODP).

Rotary coring bit with core catcher (© D. Smith, ECORD-IODP).



This represents a great opportunity for scientists to gather interesting material, from which new insights can be drawn. Ocean drilling is crucial to a number of major contemporary concerns which are also set to affect future generations, such as climate change, biodiversity and geohazards, so Dr Camoin says a lot of emphasis is placed within the programme on training young scientists and heightening awareness. “We want to provide young scientists with the opportunity to participate,” he stresses. Around 50 percent of the scientists sailing on the expeditions are early-career scientists, part of a wider commitment to training the next generation and equipping them with the skills they will need to push the research agenda forward. “This is an achievement we’re really proud of. We want to expose the younger generation to outstanding science,” says Dr Camoin.

The next ECORD expedition is set to leave in September 2019, in which researchers plan to gather samples from a succession of drowned coral reefs around Hawaii, at depths of between 134 and 1,115 metres, with the goal of defining the nature of sea-level change in the central Pacific over the last 500,000 years. Specialists in certain fields, including sedimentology, geochemistry, palaeontology

and microbiology are required to achieve the expedition’s scientific goals. “We recruit scientists with specific expertise to ensure that the scientific objectives of the expedition will be met,” explains Dr Camoin. The application process is however open to researchers from other disciplines beyond those specifically mentioned in the documentation. “We want to open the application process up as much as possible to different disciplines, to make sure that we can extract as much science as we can from the cores and samples that we gather during the expedition,” says Dr Camoin.

A variety of analytical techniques can be applied to the cores once they have been retrieved from the ocean, as scientists look to gain new insights and deepen their understanding. Different tools are available, depending on the precise nature of the sediments or rocks. “There is sedimentology, which is used to look at the physical properties of the sediment or the rocks and to reconstruct depositional environments. There are also a range of petrographical, mineralogical, geophysical and geochemical techniques,” explains Dr Camoin. Dating the samples is of course an important step in terms of building a fuller picture, while further scientific insights can also be gained. “We extract the core fluids from the sediments to analyse their chemical condition. We also study fluids circulating through the ocean floor,” continues Dr Camoin.

CoreWall system with CT scans and cores (© E. Le Ber, ECORD-IODP).

The cores collected by ECORD may be partly analysed on board the vessel, and then sent for further investigation to the very well-equipped Bremen Core Repository (BCR), one of three IODP core repositories where all drilled cores and samples gathered over the last 50 years are stored. Once an expedition has concluded, the data gathered is eventually made more widely available to the scientific

Various drill bits used for drilling different lithologies (© ECORD-IODP).

Core catcher sample (© A. Rae, ECORD-IODP).



community for further analysis. “The results are not restricted to the scientists who sailed on the expedition, they’re open to other researchers,” stresses Dr Camoin. This not only helps researchers to extract all the possible scientific insights from the cores that have been gathered, but also to heighten awareness of the IODP and its wider importance to research, such as in investigating climate change. “If you want to understand the background to climate change, you have to understand what happened in the past concerning the climatic evolution of different regions, in particular at the high latitudes, both the Arctic and the Antarctic,” says Dr Camoin. This is of course a prominent concern today, and continued research is essential to informing the ongoing debate about climate change and the likely extent of any future rise in sea levels. Analysis of sediment cores and sensor data from below the seafloor can also lead to important insights in a number of other areas, including seismology, volcanic hazards and microbes in the ocean crust, underlining the wider scientific importance of ocean drilling. “The IODP aims to gather data relevant to both fundamental and applied issues facing

society,” explains Dr Camoin. Alongside the scientific expeditions, ECORD also runs educational activities, aiming to help train the next generation of researchers. “We organise summer schools, and we pay the fees and travel costs for students to attend,” continues Dr Camoin. “There are talks from researchers, and students also do practical work on dealing with cores. It’s a really thorough training course.” This also serves to heighten awareness of the ocean drilling programme among stakeholders and the general public, which is crucial to securing continued funding, and hence its long-term future.

Beyond the expedition to Hawaii, further expeditions are planned. “In 2020, we will work with our Japanese colleagues and implement an expedition jointly to offshore Japan, to core sediments related to different historical earthquakes. We will develop a deep-sea record of earthquakes, and we will look to see how far back in time we can extend it,” says Dr Camoin. The next major expedition after that is likely to be in 2021, when Dr Camoin says the plan is to return to the Arctic Ocean. “We aim to investigate the climate history of the Arctic Ocean over the last 55 million years,” he outlines.

View from the drill rig used during IODP Expedition 325 ‘Great Barrier Reef Environment’ operated by ECORD in March-April 2010 (© D. Smith, ECORD-IODP).

ECORD

European Consortium for Ocean Research Drilling

Project Objectives

The European Consortium for Ocean Research Drilling (ECORD) is a management structure of 15 member countries, as part of the 2013-2023 International Ocean Discovery Program (IODP). Through scientific ocean drilling, IODP science addresses a wide range of fundamental and applied issues for society, such as Climate and Ocean Change, Biodiversity and Origin of Life, Earth in Motion and Earth Structure and Dynamics.

Project Funding

$17.5 million per year.

Project Partners

Österreichische Akademie der Wissenschaften (Austria), University of British Columbia (Canada), Danish Agency for Science, Technology and Innovation (Denmark), Academy of Finland (Finland), Centre National de la Recherche Scientifique (France), Deutsche Forschungsgemeinschaft (Germany), Geological Survey of Ireland (Ireland), Consiglio Nazionale delle Ricerche (Italy), Netherlands Organisation for Scientific Research (The Netherlands), Research Council of Norway (Norway), Fundação para a Ciência e a Tecnologia (Portugal), Ministerio de Economia y Competitividad (Spain), Swedish Research Council (Sweden), Fonds National Suisse de la Recherche Scientifique (Switzerland), United Kingdom Research and Innovation (United Kingdom)

Contact Details

Dr Gilbert Camoin
Director of the European Consortium for Ocean Research Drilling (ECORD) Managing Agency
CEREGE-CNRS, Europôle Méditerranéen de l’Arbois, BP 80,
13545 Aix-en-Provence, France
T: +33 4 42 97 15 14
E: camoin@cerege.fr
W: www.ecord.org

Drill cores stored at the Bremen Core Repository, MARUM, University of Bremen, Germany (© V. Diekamp, ECORD-IODP).

Dr Gilbert Camoin is a Senior Research Scientist at CNRS-CEREGE, Aix-en-Provence, France. His major scientific interests concern the records of sea level and environmental/climatic changes preserved by reef systems, and the impact of such changes on carbonate systems. Since 2012, he has managed the European Consortium for Ocean Research Drilling (ECORD), the European participation in the International Ocean Discovery Program.



Desertification: A Threat to Land and Life

In June 2018, the Joint Research Centre, the European Commission’s science and knowledge service, announced that ‘there was unprecedented pressure on the planet’s resources’, a finding verified by the publication of its New World Atlas of Desertification. In this article, we take a look at what the new analysis revealed, what the data means and how, if at all, we can reverse some of the damage.

By Richard Forsyth

Desertification, the process where land becomes desert, affects 8% of the EU, impacting Southern, Eastern and Central Europe, in areas comprising 14 million hectares of land. There are 13 Member States directly affected by desertification, namely: Bulgaria, Croatia, Cyprus, Greece, Hungary, Italy, Latvia, Malta, Portugal, Romania, Slovakia, Slovenia and Spain. The economic cost of soil degradation for the EU is estimated to be in the order of tens of billions of euros annually.

As anyone in Europe exposed to the scorching summer of 2018 will know, the heatwave was a sustained event, labelled an anomaly for its extremes, which led many countries unused to this kind of relentless temperature to get to grips with heat-related hardships. Sweden received only 12% of its normal rainfall, with wildfires engulfing large areas. Poland, Belarus, the Czech Republic, Scotland, Ireland, Germany and the Netherlands all hit the headlines for facing unprecedented droughts and the subsequent dire repercussions for crops and farms. You could say it was a wake-up call to appreciate the serious effects of climate impact, and a taste of a problem that is creeping out of control for populations worldwide.

The impacts associated with droughts give us a taste of how bad desertification can be, and obviously this is a challenge not limited to Europe’s borders. For the planet, desertification poses a serious, urgent problem, one that is getting worse over time. Without intervention, these useless, arid wastelands will make life on Earth tougher in a myriad of ways. Desertification leaves once fertile soil unable to support useful vegetation or self-sustaining ecosystems, and for people living in these redundant arid zones, it can lead to misery, mass migration, economic ruin and in some cases, death.

Deserts are economy killers

There are about 1 billion people in 100 countries whose livelihoods are threatened by desertification. Many of the populations in these dry zones are made up of the poorest and most vulnerable people. Indeed, the Atlas reports there is often ‘a destabilising link between poverty and overexploitation of biodiversity in most, if not all


dryland ecosystems.’ Where the only fuel source for communities is woody plants, the areas around these villages and cities can become devastated as shrubs are removed up to a dozen kilometres around the population. The Atlas adds that where charcoal replaces fuelwood, as a more easily transported fuel, the consequence is that the numerous provisioning ecosystem services of the former woody vegetation, including food, medicine, building material, fibres, crafts and fertiliser, are depleted.

Progressing toward desolation

‘Humans and their actions have become the main driver of global environmental change’, summarises the Atlas. Human activities are, without doubt, causing desertification and threatening biodiversity, not least through the intentional destruction of forest habitats to make way for ranches, agriculture, mining and logging. Last year saw 39 million acres of trees destroyed. A fifth of the Amazon has already been stripped and exploited for resources. It’s believed that between 100 and 200 species of plant, insect, bird and mammal vanish forever every 24 hours, a statistical indicator of the mass extinction age we are currently in, where the pace of extinction is some 1,000 times faster than what’s considered the natural order.

The Atlas confirms: ‘The world is currently losing species at an unprecedented rate. Most of this species loss can be directly attributed to human activities. Land conversion, for instance from natural woodlands to agricultural crop fields, has, over the past few decades, been a major threat to biodiversity. As habitats degrade they become less able to support biodiversity, with badly degraded habitats typically having a reduced biodiversity, or a biodiversity that is shifted to early successional species. Heavy overgrazing destroys perennial grass species that are replaced with annual grasses and weedy forbs, including exotic invasive species. This increases the rates of soil erosion and provides less palatable grazing for livestock.’

The last point illustrates how intensive short-term exploitation of land for agriculture can be ultimately self-defeating in purpose, as the land for farming becomes unusable. On a global scale this has been



recognised as a significant challenge to sustaining food security. Around 44% of the world’s agricultural land is located in drylands, mainly in Africa and Asia, and supplies about 60% of the world’s food production. Most of this production has been achieved through innovations like improved seeds, chemical fertilisers, enhanced technologies and irrigation. But times and climates are constantly changing, and so the challenges are intensifying. Currently, over 75% of the Earth’s land area is already degraded, and over 90% could become degraded by 2050. Meanwhile supply must meet demand in terms of yields, and food production needs more and more land to sustain supply.

The Atlas indicates that: ‘Agricultural production must continue to meet the needs of a rapidly growing global population. One estimate is that over 1 billion hectares of “wild” land will have to convert to agriculture to feed the global population by 2050. However, not all land is suitable for agriculture and there is intense and increasing competition for land due to urbanisation, bioenergy farming, forest plantations and protected areas.’

It’s an equation heading toward a tipping point, a kind of ‘peak farming’ moment in the future. As more agricultural land is needed, more land is becoming unusable. Globally, a total area half the size of the European Union (4.18 million km²) is degraded annually, with Africa and Asia being the most affected. Land degradation and climate change are estimated to lead to a reduction of global crop yields by about 10% by 2050. Most of this will occur in India, China and sub-Saharan Africa, where land degradation could halve crop production. This will be at a time when the global population will have increased to around the 10 billion mark.

Chains of events

The facts paint a bleak picture of degradation and depleted resources, where food security and biodiversity will be seriously impacted. The report says: ‘Food production in drylands is threatened by water shortages, climate change, land degradation and persistent poverty. Climate change may have a major impact on drylands as temperatures become more extreme (hot and cold), rainfall declines, groundwater tables drop and climate zones shift. Although climate change will likely increase aridity, the actual risk to agriculture is difficult to quantify. In addition, economic uncertainties and social unrest can lead to “debilitating levels of outmigration and instability” in drylands, which, besides the direct toll on human presence, may lead to lower agricultural productivity and further stagnation and marginalisation of local economies.’

As an important footnote, the consequence of both increased desertification and accelerated deforestation is the stark reality that it will become increasingly difficult, if not impossible, to mitigate the ongoing effects of climate change.

Urgent response needed

It needs to be said at this juncture that, despite the grim analysis, the New World Atlas of Desertification is intended to be more useful than a mere message of doom. Whilst it provides a comprehensive, evidence-based assessment of land degradation at a global level, it more importantly highlights the urgency of adopting corrective measures. It provides an evidence-based information resource to steer policymakers and decision makers on agricultural restoration strategies.

The previous editions of the EU’s Atlas, assessing desertification, were published in 1992, ahead of the Earth Summit in Rio de Janeiro, and in 1998. This time around, the results produced for the Atlas were created with new, advanced data processing methods. EU scientists used thousands of computers and 1.8 petabytes of satellite data, a volume of data equivalent to 2.7 million CD-ROM discs, to piece together a highly accurate portrait of the current state of desertification on Earth.

Tibor Navracsics, Commissioner for Education, Culture, Youth and Sport, responsible for the Joint Research Centre (JRC), was clear that the evidence from the new data required an urgent response from countries around the world. “Over the past twenty years, since the publication of the last edition of the World Atlas of Desertification, pressures on land and soil have increased dramatically,” said Navracsics. “To preserve our planet for future generations, we urgently need to change the way we treat these precious resources.”
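As a quick check on that comparison (assuming a standard 700 MB CD-ROM and decimal prefixes, neither of which the Atlas specifies), the arithmetic works out as

\frac{1.8\ \text{PB}}{700\ \text{MB per disc}} = \frac{1.8 \times 10^{9}\ \text{MB}}{700\ \text{MB}} \approx 2.6 \times 10^{6}\ \text{discs},

which is consistent with the quoted figure of around 2.7 million CD-ROMs.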

Increased migrations

It’s clear from the new evidence that human activity, population growth and increased consumption have converged to put unprecedented pressure on the planet’s natural resources. Human migratory patterns, something which has led to political and cultural division around the world, especially since the recent turmoil in the Middle East, will intensify. By 2050, up to 700 million people are estimated to be displaced due to issues linked to scarce land resources.

Dr Stergomena Lawrence Tax, Executive Secretary of the Southern African Development Community (SADC), recently addressed this issue and said: “Land degradation and drought are challenges that are intimately linked to food insecurity, migration and unemployment. In just 15 years, the number of international migrants worldwide has risen from 173 million in 2000 to 244 million in 2015, some of which are a result of environmental challenges. Rapid population growth and changing consumption patterns have generated excessive pressure on our finite land resources, leading to land degradation around the world. Globally, thirty percent of all land has lost its true value due to degradation.”

The New World Atlas of Desertification provides examples of how human activity drives species to extinction, threatens food security, intensifies climate change and leads to people being displaced from their homes.



Can we revive dead lands?

The big question is, realistically, can anything be done? The burdens on the land will only increase as the population increases. Navracsics is hopeful the new resource will be a starting point for understanding the issues and countering them. He said: “This new and much more advanced edition of the Atlas gives policymakers worldwide comprehensive and easily accessible insights into land degradation, its causes and potential remedies to tackle desertification and restoring degraded land.”

Under the United Nations’ Sustainable Development Agenda, world leaders have committed to ‘combat desertification, restore degraded land and soil, including land affected by desertification, drought and floods, and strive to achieve a land degradation-neutral world’ by 2030. While at global level desertification is addressed by


the United Nations Convention to Combat Desertification (UNCCD), land degradation is a problem that also concerns the United Nations Framework Convention on Climate Change and the Convention on Biological Diversity. The importance of land degradation and desertification led to the adoption of Sustainable Development Goal 15.3, aiming at land degradation neutrality.

Restoration strategies rely on ideas like integrating land and water management, planting trees and plants, reintroducing species, seeding areas, and taking better care of the land through agricultural policies. It also means working closely with communities to manage their land, even changing the use of the land – for example, ‘sector changing’ to tourism instead of farming. As for the EU’s specific targets, Karmenu Vella, Commissioner for Environment, Maritime Affairs and Fisheries, highlights the importance




of “action on soil protection and sustainable land and water use in policy areas such as agriculture, forestry, energy and climate change.” It’s an approach recommended in the EU Soil Thematic Strategy, and as Vella assesses: “It’s our best hope of achieving land degradation neutrality in line with the 2030 Sustainable Development Goals”.


One thing is certain: desertification is a worldwide problem that needs worldwide action. Anything short of this will mean rethinking harvests, food supply and where many of us choose to live.

New World Atlas of Desertification: https://wad.jrc.ec.europa.eu/



Tailor-made fuels for the engines of tomorrow

Great progress has been made over recent years in improving the efficiency of biofuel production, with scientists working on both optimising processes and improving fuel quality. Now the Fuel Science Center will be established to work towards the development of bio-hybrid fuels, as Dipl.-Ing. Bastian Lehrheuer explains.

Many governments across the world are keen to harness the potential of waste material from plants and animals as a source of energy, part of the wider goal of reducing CO2 emissions. This is a central part of the agenda for researchers in Tailor-Made Fuels from Biomass (TMFB), a German Research Foundation (DFG) funded research Cluster of Excellence based at RWTH Aachen University in Germany. “We are targeting closed carbon cycles, so that we can help to reduce the impact of fuel and combustion on the climate and environment,” says Bastian Lehrheuer, Chief Operating Officer (COO) of the cluster. The wider goal is to efficiently produce fuels from biomass; Lehrheuer says it is important to consider the entire cycle in this respect. “CO2 is bound in biomass by nature. We don’t want to carelessly burn that biomass, we want to use that energy in the most efficient way,” he explains. “So we want to reduce CO2 and pollutant emissions on the production side as well as on the propulsion side.”

Well-to-wheel

Researchers are looking at this from well-to-wheel, i.e. from the source of biomass,


to the transport of the materials and further processing, right through to the eventual use of the fuel in a vehicle. The biomass itself has to be formed of waste material, so that it does not limit food production capacity or negatively affect it in any other way. “With the processes and methodologies that we are investigating, it is possible to use not only wood chips as feedstock for biofuel

production, but also waste from the forestry industry, or straw,” outlines Lehrheuer. The aim here is not to develop the perfect fuel, but rather to develop effective, reliable methodologies. “We develop methodologies to evaluate holistically the potential of new biofuel candidates to reduce CO2 and pollutant emissions. We are devising simulation models that help us to find the optimum fuel components, or combinations of those components,” continues Lehrheuer. “We want to develop, on a more fundamental basis, the methodologies necessary to evaluate the whole life cycle of bio-based fuels.” The fuel production process itself is highly complex, and many different factors need to be taken into account to build a more complete picture and identify where efficiency could be improved.

We develop methodologies to evaluate holistically the potential of new sustainable fuel candidates to reduce CO2 and pollutant emissions.

Some steps of the fuel production process may produce some heat, for example, which could potentially be used elsewhere within the process. “We look at the energy flows, the energy demand, and the resulting CO2 and pollutant emissions. The target is to reduce those, and to ultimately produce climate-neutral fuels,” explains Lehrheuer. A balance needs to be struck here between enhancing the quality of the fuel and optimising the production process; Lehrheuer is convinced



that improved simulation methods will help lead to the development of more sustainable solutions. “We have successfully established a model-based fuel design process within TMFB and now have the chance to substantially improve and extend it within the scope of the upcoming Fuel Science Center,” he stresses. This enables researchers to identify which combinations of processes and molecules are most effective.

The main bio-based alternative fuel currently used for gasoline engines is ethanol, but Lehrheuer says other options could be explored as well. “If we use some other components in a fuel, like butanol, then this actually could lead to advantages in terms of both production and combustion. On the production side, a lot of effort is required to get a very pure molecule, so we can improve efficiency if we target fuel mixtures from the very beginning. It is also helpful on the combustion side, as we can increase the octane number, thereby improving efficiency and reducing the particulate emissions,” he explains. The fuel design process has been developed to identify these kinds of benefits, so that fuels can be designed and tailored in an integrated way. “We tailor the fuel to achieve an overall optimum in production and propulsion. The primary motivation behind this work is to reduce CO2 and pollutant emissions,” outlines Lehrheuer.

Researchers are also considering the way a fuel candidate performs in an engine. While gasoline and diesel engines are the most common combustion systems currently on the market, other systems are in development, all with different requirements, so identifying the parameters by which the effectiveness of a fuel can be assessed is a complex task. “It is not only about finding those parameters, it is also about how we measure them. How do we determine those parameters? What is the allowable range, and what boundary conditions do we have to consider?” Lehrheuer points out. In the case of diesel fuel, the most obvious parameter is the cetane number, an indicator for the ignition behaviour, while other important aspects include the density of a fuel or the entropy of evaporation. “How well does the fuel evaporate under in-cylinder conditions?” continues Lehrheuer.
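To make the idea of a model-based fuel design process more concrete, the sketch below shows the general shape of a multi-criteria screening step: each candidate blend receives a production-side and a combustion-side score, and the two are traded off in a single objective. This is only an illustration of the concept; the candidate names, property values and weights are invented and are not taken from the TMFB or Fuel Science Center models.

# Illustrative only: toy screening of fuel candidates on production and
# combustion criteria. All numbers, weights and candidates are invented.

CANDIDATES = {
    # name: (production_energy [MJ/kg, lower is better],
    #        octane_number [higher is better],
    #        sooting_tendency [arbitrary 0-1 index, lower is better])
    "ethanol":             (18.0, 109, 0.10),
    "butanol":             (22.0, 105, 0.15),
    "ethanol/butanol mix": (19.5, 107, 0.12),
}

def score(production_energy, octane, soot, weights=(0.4, 0.4, 0.2)):
    """Weighted score; each criterion is first scaled to roughly 0-1."""
    w_prod, w_oct, w_soot = weights
    prod_term = 1 - min(production_energy / 30.0, 1.0)   # cheaper to produce -> higher
    oct_term = octane / 120.0                            # more knock-resistant -> higher
    soot_term = 1 - min(soot, 1.0)                       # cleaner burning -> higher
    return w_prod * prod_term + w_oct * oct_term + w_soot * soot_term

for name, props in sorted(CANDIDATES.items(), key=lambda kv: score(*kv[1]), reverse=True):
    print(f"{name:20s} score = {score(*props):.3f}")

In a real fuel design model these scores would come from detailed process simulations and combustion experiments rather than fixed numbers, but the trade-off structure, weighing production effort against combustion behaviour, is the same.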

www.euresearcher.com

Another important consideration beyond a fuel’s performance in an engine is cost efficiency. A lot of attention is paid to developing biofuels that work efficiently in a combustion engine, yet it is also essential to consider their economic viability. “That is also a very interesting point, and it is incorporated in the model,” says Lehrheuer. This does not mean just looking at the overall cost of the eventual fuel, but also considering the by-products generated during production; Lehrheuer points to the example of ethanol again. “We have learned that sometimes by-products are more valuable than the product which is actually targeted. With ethanol, if we aim for 100 percent ethanol, we reduce the generation of by-products that can be very valuable, for example in the chemical sector,” he explains. “If we balance that efficiently, we can produce both ethanol and by-products. So we also have to consider such economic effects in our models.”

Fuel Science Center The DFG has recently approved funding for a further seven years of research that builds on the achievements of TMFB, and the new Fuel Science Center (FSC) will start in January 2019. “We are extending the work to include molecules derived from renewable energy sources,” outlines Lehrheuer. This involves combining the benefits of renewable energy and biomass to create what Lehrheuer and his colleagues call bio-hybrid fuels. “Biomass is a long-chain carbon source, and we combine those molecules with those from E-fuels to form new fuel candidates,” he says. “The FSC will officially start working in January, focusing on this combination. It will build on our ten years of experience with TMFB, and in particular on our model-based fuel design processes.” The flexibility of the production and the propulsion system is an important aspect of this research. It may be that in one season a high amount of biomass is available but a low amount of renewable energy, so a certain degree of flexibility is essential. Using resources more efficiently is central to developing sustainable fuel solutions, a topic that will be at the core of the FSC’s research. “We are doing fundamental research, and developing an entire methodology,” stresses Lehrheuer.

TAILOR-MADE FUELS FROM BIOMASS
Cluster of Excellence EXC 236

Project Objectives

• TMFB was established in 2007 with the scientific objective of optimizing the entire process chain from biomass to vehicle propulsion. Using an interdisciplinary approach to perform research on new synthetic fuels obtained from biomass feedstock via target-designed production routes, TMFB explores efficient and clean future combustion systems.
• Vision of the Cluster: Establish innovative and sustainable processes for the conversion of whole plants into fuels which are tailor-made for novel low-temperature combustion engine processes with high efficiency and low pollutant emissions, paving the way to new generations of biomass fuels.
• Continuing to pursue its long-term vision, the CoE “Tailor-Made Fuels from Biomass” aims to achieve the model-based description and optimization of the entire process chain from biomass to propulsion in the second funding period.

Project Funding

Funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany’s Excellence Strategy – EXC 236. http://www.dfg.de/en/index.jsp

Project Partners

Host University: RWTH Aachen University
Participating non-university research institutions:
• Max-Planck-Institut für Kohlenforschung, Mülheim
• Forschungszentrum Jülich
• Max-Planck-Institute for Chemical Energy Conversion

Contact Details

Dipl.-Ing. Bastian Lehrheuer
Institute for Combustion Engines VKA
RWTH Aachen University
Fuel Design Center
Schinkelstraße 8
52062 Aachen, Germany
T: +49 241 80 95352
E: lehrheuer@vka.rwth-aachen.de
W: www.vka.rwth-aachen.de / www.fuelcenter.rwth-aachen.de

Dipl.-Ing. Bastian Lehrheuer

Dipl.-Ing. Bastian Lehrheuer studied Mechanical Engineering at RWTH Aachen University. In 2012 he started his academic career at the Institute for Combustion Engines, where he has worked on numerous research projects on gasoline combustion system development. In July 2018, he took over responsibility as Chief Operating Officer (COO) of the Cluster of Excellence TMFB.



From carnivore to herbivore: the evolution of vertebrate feeding behaviour

The first land-based vertebrates are thought to have fed mainly on animal matter before they later began feeding on plants and other resources. Researchers in the VERTEBRATE HERBIVORY project are using two independent dietary proxies to investigate when and in which species this change occurred, as Professor Thomas Tütken explains.

Early land-based vertebrates

are thought to have fed mainly on insects, an easily digested protein source, when they first found their way onto land around 380 million years ago. Later, the diets of some species began to evolve and diversify to include resources such as plants. This dietary transition is a topic of great interest to Professor Thomas Tütken, the principal investigator of an ERC-funded project in which researchers are studying the fossil record to gain deeper insights into the evolution of vertebrate feeding behaviour. “We are using two independent dietary proxies, one chemical and one mechanical. We are analysing the tooth enamel isotopically and also studying the enamel surface microscopically to determine how the food physically abraded the tooth surface,” he explains. Researchers aim to use these two complementary sources of information to distinguish between species which fed on different resources, specifically plant and animal matter. “In terms of animal matter, we also want to resolve insect-feeding from meat/bonefeeding,” continues Professor Tütken. Controlled feeding experiments with small mammals, birds and reptiles raised on plant-, insect- and meat-


based diets are a cornerstone of this research, enabling scientists to validate these dietary proxies. The ultimate goal is to then apply these proxies to fossils to understand the evolution of plant-feeding among vertebrates.

You are what you eat: isotope analysis of bioapatite

A key part of this work centres around analysing stable calcium (Ca) and strontium (Sr) isotopes in vertebrate fossils. “Hard tissue is often preserved from vertebrates. The best geochemical archive is usually the tooth enamel, because it’s already been in vivo mineralised to around 96 percent. That means there is less chance of modification during the fossilisation process,” he outlines. The key target in the project is therefore enamel, yet researchers are also able to analyse fossil bones too, at least for the Ca isotopes. “This gives us opportunities to investigate fossils even from toothless taxa such as birds, turtles and some dinosaurs, as well as from specimens for which teeth are not available for minimally invasive sampling,” says Professor Tütken. The ERC project team is combining analysis of stable Ca and Sr isotopes with 3D surface texture analysis to gain a deeper understanding of the


vertebrate diet (Fig. 1). Calcium is an important aspect of this work as it is primarily obtained through dietary ingestion. Furthermore, Ca is the major component of bioapatite - the mineral matter of which teeth and bones are composed. “Calcium accounts for around 38 percent of bioapatite. Its abundance enables us to analyse the Ca isotope composition of tiny, sub-mg amounts of enamel or bone and, importantly, calcium is not easily biased by fossilisation processes, compared to trace elements,” outlines Professor Tütken. Strontium isotopes are another important source of information. “We look at the ratios of the two stable Sr isotopes, 88Sr and 86Sr, as well as radiogenic 87Sr and stable 86Sr. The former provides complementary, diet-related information similar to that provided by Ca isotopes (Fig. 2), whereas the latter can be used as a provenance fingerprint for the geological substrate on which the food was ingested,” says Professor Tütken. A key principle here is isotope fractionation, essentially meaning changes in isotope ratios. Evidence suggests there is a general decrease in Ca isotope ratios along the food chain. “The 44Ca/42Ca ratio decreases both by the preferential transfer of the light Ca isotope into


Fig. 1. Dietary proxy tool box: combined Ca isotope and enamel surface texture analysis applied in a single tooth approach. Reconstruction of Gastornis by Hellmund & Stache 2015, photo: M. Scholz, Halle (Saale).



plants via root systems, and from the plant to the vertebrate hard tissues,” says Professor Tütken. The current thinking is that this fractionation occurs due to biomineralisation, the formation of bone and enamel. “Therefore bones and teeth are enriched in light Ca isotopes relative to the diet,” he continues. “A vertebrate feeding on animal matter may mostly eat the flesh, but if they digest even a little bit of bone rich in isotopically light Ca, then the calcium isotope signal from the diet is dominated by the small fraction of bone and not by the meat.” This is very different from a herbivorous vertebrate, which does not have this light isotope source of calcium from the bone. This means that herbivores tend to have isotopically heavier bones and teeth, allowing scientists to distinguish between bone/meat-feeders and plant-feeders (Fig. 2).

Professor Tütken and his colleagues are also looking at vertebrates which feed on insects. “Insectivores like termite-eating aardvarks and bats tend to have more enriched calcium isotope values in their bones and teeth, and are distinct from both herbivores and carnivores (Fig. 2),” he explains. There are still overlaps between different feeding groups though, so research in this area is ongoing. “Ideally, we would be able to refine this picture and better understand the difference between insect-, plant-, and flesh/bone-feeding. This will be crucial for correct diet assignment of extinct species,” he outlines.
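Ca isotope compositions of this kind are conventionally reported in delta notation relative to a reference standard and expressed in parts per thousand; the general form below is standard isotope-geochemistry practice, although the specific reference material used by the project is not named in this article:

\delta^{44/42}\mathrm{Ca}\ (\text{in ‰}) = \left( \frac{\left(^{44}\mathrm{Ca}/^{42}\mathrm{Ca}\right)_{\mathrm{sample}}}{\left(^{44}\mathrm{Ca}/^{42}\mathrm{Ca}\right)_{\mathrm{standard}}} - 1 \right) \times 1000

A trophic decrease in the 44Ca/42Ca ratio therefore shows up as progressively lower (more negative) δ44/42Ca values towards bone/meat-feeders, which is the separation between feeding groups illustrated in Fig. 2.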

Dental wear reflects diet: surface texture analysis of teeth

While tooth morphology is an important indicator of diet, it may not reveal what an animal has actually ingested and digested. The use of non-destructive 3D surface texture (3DST) analysis via confocal light microscopy on teeth represents another independent source of information with which researchers can build a more detailed picture of vertebrate diet. Professor Tütken and his colleagues are using what can broadly be described as a nanometre-scale 3D model of the tooth surface. “We can characterise peaks, valleys and plateaus on teeth,” he says. The number, shape and orientation of these features (Fig. 3) caused by dental wear are thought to be characteristic of certain diets. However, this assumption needs to be validated, which is being done through controlled feeding studies and by analysing teeth of animals from the wild that are known to be dietary specialists in the different feeding groups. “Then we can compare those adaptations, look for similarities in wear patterns, and thereby make inferences about the abrasiveness potential (i.e. type) of the diet,” he continues.
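As a rough illustration of what quantifying such a surface involves, the short Python sketch below computes two simple measures from a height map: the root-mean-square height and a crude peak count. This is a minimal sketch only; the project’s actual 3DST workflow, parameter set and filtering steps are not described in this article, and the synthetic data here stands in for a measured enamel patch.

import numpy as np

# Minimal illustration: two simple texture measures from a height map z(x, y)
# in micrometres. Real 3DST analyses use standardised areal parameters and
# levelled, filtered measurement data; the surface below is synthetic.

rng = np.random.default_rng(0)
z = rng.normal(scale=0.5, size=(256, 256))      # stand-in for a measured enamel patch

def rms_height(height_map):
    """Root-mean-square height: overall roughness about the mean plane."""
    centred = height_map - height_map.mean()
    return np.sqrt(np.mean(centred ** 2))

def peak_fraction(height_map, threshold=1.0):
    """Fraction of pixels standing more than `threshold` above the mean plane."""
    centred = height_map - height_map.mean()
    return np.mean(centred > threshold)

print(f"RMS height = {rms_height(z):.3f} µm, peak fraction = {peak_fraction(z):.3%}")

Comparing parameters of this kind between teeth worn by different diets is, in simplified form, what allows wear patterns to be matched to the abrasiveness of the food.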


Fig. 2. Stable Ca and Sr isotope compositions of extant mammal bones distinguish animal- from plant-feeders.

There are abrasive materials in all of the different feeding categories, and distinguishing between them can be a complex task. This data is nevertheless an important independent proxy for food toughness and hardness, which complements the isotope data. The ideal scenario would be to combine analyses of the two dietary proxies on the same tooth. “A challenge here is that reptiles, unlike mammals, do not chew and therefore have different and less food-tooth contacts. Therefore, we are currently working to establish a 3DST reference frame for extant reptiles to assess the diet of extinct reptiles.”

The broader goal of the project is to develop a well-defined dietary toolbox of analytical methods that will ultimately be applied to fossil teeth of dinosaurs and mammal-like reptiles, from which mammals evolved, to infer their diets (Fig. 1). Professor Tütken and colleagues have already used these techniques to reconstruct the diets of some iconic extinct vertebrates. “We successfully used Ca isotopes to demonstrate that the giant flightless bird Gastornis was a herbivore despite its massive bony beak, and we’re currently trying to assess the amount of bone consumed by T-Rex and other theropods using a combined enamel surface texture and Ca isotope approach,” he explains. The ERC group is planning to study food webs in the fossil record to better understand extinction and speciation. “We could gain information as to whether flexibility in their diet could have been a factor in the success or demise of some taxa,” he says. In the future, these methods may also be used to investigate diet-related research questions in archaeology or ecology.

Fig. 3. Photosimulation of the enamel surface texture of Iguana iguana, an extant herbivorous reptile.

VERTEBRATE HERBIVORY
Evolution of herbivory in vertebrates: developing combined isotope (Ca, Sr) and dental surface texture analysis as deep time diet proxies

Project Objectives

The ERC Consolidator Grant VERTEBRATE HERBIVORY aims at developing a new toolbox for dietary reconstructions by combining 3D dental surface wear analysis with calcium and strontium isotope analysis of teeth. These non-destructive or minimally invasive techniques will first be validated in controlled feeding experiments and then applied to fossil teeth for dietary reconstructions of extinct vertebrates and past food webs. The ultimate goal is to assess the evolution of plant-feeding among mammal ancestors and dinosaurs.

Project Funding

• European Research Council (ERC): Consolidator Grant (No. 681450) VERTEBRATE HERBIVORY

Project Partners

• Vetsuisse, Universität Zürich
• Max-Planck-Institut für Chemie, Mainz (MPIC)
• Centrum für Naturkunde, Universität Hamburg (CenNak)

Contact Details

Professor Thomas Tütken Institut für Geowissenschaften AG für Angewandte und Analytische Paläontologie Johannes Gutenberg-Universität Mainz J.-J.-Becher-Weg 21 55128 Mainz GERMANY T: +49 (0) 6131-39-22837 E: tuetken@uni-mainz.de W: http://www.paleontology.uni-mainz.de/pub_tt.html

Professor Thomas Tütken

Professor Thomas Tütken is academic senior councilor at JGU Mainz. He develops and applies geochemical methods in palaeontology to reconstruct the palaeobiology and palaeoecology of extinct vertebrates as well as to understand fossilization processes of bones and teeth. Currently he performs deep time dietary and body temperature reconstructions of mammal-like reptiles and dinosaurs. http://www.researcherid.com/rid/E-4988-2010



The building blocks of new materials

A deeper understanding of complex particles and the way they interact with each other could open up the possibility of researchers using them as building blocks in the design and development of new materials. We spoke to Dr Laura Rossi about her work on the synthesis of complex colloidal particles.

A lot of attention in research is focused on investigating the properties of certain particles and the way they interact with each other, which could open up new possibilities in material design. Based at TU Delft in the Netherlands, Dr Laura Rossi and her colleagues' research centres on the design and preparation of colloidal particles, which can broadly be described as any objects with a size between a few nanometres and a few microns. "We're using colloidal particles as building blocks in materials development," she outlines. This is a challenging area, so rather than looking at complex materials, Dr Rossi is currently focusing her energy on two-dimensional structures. "2-d materials are the first step towards more complex structures. The idea is that they are easier to produce and to understand than 3-d materials," she says.

Colloidal particles

The focus in research is on using colloidal particles to develop such 2-d structures, work which is built on continued investigation into their properties and the way they interact with each other. The size of the particles is an important attribute in this respect. "We use particles which are big enough that we can image them with simple optical techniques like light microscopy, while they are also small enough that, in certain respects, they interact in the same way as atoms and molecules," explains Dr Rossi. It is possible to image these particles with a high degree of precision, from which researchers can gain new insights into how they assemble. "As the particles themselves are relatively large, we can image them while the process happens, and that's very important," stresses Dr Rossi. "By imaging single particles while the process is happening, we can learn more about how it happens." This research also holds important implications in terms of the wider goal of preparing the colloidal particles so that they assemble themselves in particular ways. The many different materials present in nature are created using building blocks that interact through very specific and directional interactions.


Colloidal cubes with a definite dipole moment spontaneously form large 2D ordered structures useful, for instance, to study defect dynamics.

"We're looking at using magnetic interactions to provide directional attachments between colloids," says Dr Rossi. The use of magnetism in this respect is not entirely straightforward; Dr Rossi and her colleagues are looking to build on existing foundations in this area. "The idea is to use what we already know about physics to create a new colloidal building block that helps us design new materials," she outlines. "Ultimately, what we want to have is a programmable building block."

"The idea is to use what we already know about physics to create a new colloidal building block that helps us design new materials."

From this point, researchers could then look to use these building blocks in the development of 2-d materials with specific geometries, such as graphene for instance, a material that has generated a lot of interest due to its unique electrical, optical, thermal and mechanical properties. The important point in this respect is programmability, yet Dr Rossi says this is very hard to obtain in colloidal particles. "Colloidal particles have been used to model atomic and molecular systems, but it's usually been in recording somewhat simpler systems, like glasses or simple crystals, and only in recent years have more complex colloidal structures started appearing," she explains. The aim is to achieve a high level of specificity, where the colloids behave in exactly the desired way. "In situations when we have a specific geometry in mind, we need to be able to reverse-engineer the building blocks to obtain that specific geometry," continues Dr Rossi. The way that magnetic colloids interact with each other is very different from other colloids, which is why it's very important to understand the fundamental nature of these interactions.
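The directional attachments described above ultimately come from the anisotropy of the dipole-dipole interaction: whether two magnetic moments attract or repel depends on their orientation relative to the line joining the particles. A minimal, purely illustrative Python sketch of that textbook energy is given below; the moment and separation values are arbitrary placeholders, not measurements from Dr Rossi's particles.

```python
import numpy as np

MU0 = 4 * np.pi * 1e-7  # vacuum permeability, T*m/A

def dipole_dipole_energy(m1, m2, r_vec):
    """Interaction energy of two point magnetic dipoles m1, m2 separated by r_vec."""
    r = np.linalg.norm(r_vec)
    r_hat = r_vec / r
    return (MU0 / (4 * np.pi * r**3)) * (np.dot(m1, m2) - 3 * np.dot(m1, r_hat) * np.dot(m2, r_hat))

# Two equal moments one micrometre apart: head-to-tail alignment is attractive,
# side-by-side parallel alignment is repulsive. This anisotropy is what gives
# dipolar colloids their directional 'bonds'.
m = 1e-15  # placeholder moment, A*m^2
r_vec = np.array([1e-6, 0.0, 0.0])
print(dipole_dipole_energy(np.array([m, 0, 0]), np.array([m, 0, 0]), r_vec))  # negative (attractive)
print(dipole_dipole_energy(np.array([0, m, 0]), np.array([0, m, 0]), r_vec))  # positive (repulsive)
```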



DESIGN OF 2D SOFT MATERIALS
Design of 2-dimensional soft materials

Project Objectives

The aim of the project is to develop smart colloidal building blocks that interact through very specific magnetic interactions. These particles are programmed to form 2D assemblies with predefined architectures which are useful to study single-particle dynamics in 2D structures and to design novel soft materials such as membranes and coatings.

Project Funding

Funded by the Netherlands Organisation for Scientific Research • https://www.nwo.nl/en/funding/ourfunding-instruments/nwo/innovationalresearch-incentives-scheme/veni/index.html

Project Partners

Stefano Sacanna group, Molecular Design Institute, Department of Chemistry, New York University

Contact Details

Project Coordinator, Professor Laura Rossi Delft University of Technology Van der Maasweg 9, Room D2.180 2629HZ Delft the Netherlands T: +31 30 253 3406 E: L.Rossi@tudelft.nl W: http://www.mycolloids.com/research.html

Model of a 2D network predicted to arise from colloidal particles having 3 magnetic patches.

Researchers are looking to put magnets onto the colloids in specific locations, aiming to create a certain structure. "If you locate these tiny magnets in the colloids, in a very specific pattern, then you can create the structure that you want," outlines Dr Rossi. One of the structures Dr Rossi and her colleagues are targeting is that of graphene. "The carbon atoms in graphene are arranged in a honeycomb lattice," she says. "You can imagine having the same architecture, where carbon atoms are replaced by colloids. This is quite difficult to obtain, because the interactions between colloids must be precisely engineered." A second structure that Dr Rossi is targeting is more amorphous, in that it is not a periodic structure. A high level of control over the properties of the colloids is required in order to form these structures, and Dr Rossi says that results so far are promising. "We see that with magnetic interactions between the colloidal particles, we can make certain structures. We can obtain structures that can be tuned, and can be programmed in a way – this is a first step towards programmable structures," she outlines. Dr Rossi and her colleagues have found that it


is possible to re-configure certain structures using magnetic building blocks. “Not only can you make structures, but you can also change them by applying for instance an external magnetic field – and the particle will respond to this external field,” she says. “We can think of moving from a structure with a specific property to a structure with a completely different property, by applying some external stimuli or cues.” This feature plays an important role, for instance, in the development of re-configurable materials. The next step could be to build further on these initial findings and look towards more complex structures, with the eventual objective of moving towards 3-d structures and designing materials with specific mechanical or optical properties. This holds important implications for the future of materials development. “We want to understand how we can tune the interactions of the colloids so that we can change the properties of a material. That’s looking towards the future, when we have a higher level of control over these structures,” says Dr Rossi.

L. Rossi, J.G. Donaldson, J.-M. Meijer, A. V. Petukhov, D. Kleckner, S.S. Kantorovich, W. T. M. Irvine, A. P. Philipse and S. Sacanna, Competing anisotropic interactions in dipolar hematite cube assemblies, Soft Matter, 14, 1080-1087 (2018).
S. Sacanna, L. Rossi and D. J. Pine, Magnetic click colloidal assembly, Journal of the American Chemical Society, 134, 6112-6115 (2012).

Professor Laura Rossi

Dr Laura Rossi is an Assistant Professor in the Advanced Soft Matter Group at TU Delft. Earlier in her career she undertook training at leading soft matter laboratories, including the Center for Soft Matter Research at New York University and the van’t Hoff Laboratory for Physical and Colloid Chemistry at Utrecht University where she received her doctoral degree in 2012.



© NASA, ESA, the Hubble Heritage Team (STScI/AURA), and A. Aloisi (STScI/ESA)

Tracing the first steps of galaxy evolution

Galaxy surveys are a central tool in investigating the history of the universe, allowing researchers to look far back in cosmic time and study different stages of galaxy evolution. Researchers in the BUILDUP project aim to reconstruct galaxy assembly and evolution when the universe was young, as Professor Karina Caputi explains.

The universe is thought to be around 13.8 billion years old, and researchers continue to probe ever deeper into its history and evolution. While in the first billion years after the Big Bang the universe was composed solely of gas and light, gravitational collapse then led to the formation of stars and galaxies. "That period started less than one billion years after the Big Bang; it's called reionization. That's because the first stars and galaxies produced copious amounts of ultraviolet photons that were able to ionize the surrounding gas," explains Professor Karina Caputi, the Principal Investigator of the BUILDUP project. This is considered to be the starting point of galaxy evolution, and Professor Caputi and her colleagues in the project now aim to build a fuller picture of how galaxies formed over the subsequent few billion years after reionization. "Our aim is to reconstruct galaxy evolution over that epoch, just after the formation of the first stars and galaxies. This project is about the first steps of galaxy evolution," she explains.

Reaching the peak activity epoch


The peak activity epoch of the universe, during which star formation activity reached a peak, happened long after the reionization period, around 10 billion years ago. "How that peak activity epoch was reached, from the beginning of the reionization period, is quite unknown. That period is still relatively unexplored," says Professor Caputi. This period is the main focus of the project. "The big goal of the project is to look at galaxy evolution after the formation of the first galaxies. So, the period between the formation of the first galaxies and the point at which the universe reached that peak activity epoch," continues Professor Caputi. "During that epoch, galaxies were intensively forming stars across the universe. While there are more galaxies today, the star formation rates and the overall levels of activity are much lower." By way of comparison, the Milky Way currently forms one or two new stars a year, whereas star formation rates during the peak activity epoch were well into double figures, and sometimes even reached the hundreds. Professor Caputi and her colleagues in the project are analysing data gathered from the Spitzer Space Telescope to investigate the formation of galaxies from right back beyond the peak activity epoch. "Our

galaxy survey starts with data from Spitzer – but we also make use of other sources of data,” she outlines. The survey has a unique combination of area and depth, from which researchers hope to learn more about how galaxies developed during the young universe. “Sometimes you have very deep images of a small region of the sky, while on other occasions we have large images which are not very deep, so we cannot see that far,” says Professor Caputi. “The advantage of our survey is that it has a very nice combination of area and depth.” The Spitzer Space Telescope itself has been operating for almost 15 years now, sending back images of vast numbers of stars and galaxies at infrared wavelengths. While there is clearly enormous scope for observation from a telescope like Spitzer, Professor Caputi focuses her attention on a particular patch of sky. “If you want to see unknown galaxies then you need to stay for a long time in the same blank piece of the sky,” she explains. By devoting more time to observing the same patch of the sky, researchers hope to learn about previously unknown galaxies. “Certain galaxies are clearly visible – but if you don’t



"During the peak activity epoch, galaxies were intensively forming stars across the universe. While there are more galaxies today, the star formation rates and the overall levels of activity are much lower."

stay long enough then you would never see new galaxies,” points out Professor Caputi. “If you stay for a long time in a particular part of the sky, and you keep the telescope there, then you would collect enough photons to eventually see them.” This is allowing researchers to look back deep into cosmic time and collect statistics and data on around 300,000 galaxies, all present in the patch of sky that Professor Caputi and her colleagues are studying. These galaxies were formed at different points in time, so there are a number of factors to consider in studying them. “We study the spectra of these galaxies by looking at many images from different wavelengths. We look at the shape of these spectra, and the way these have shifted into the red,” explains Professor Caputi. This shift is produced by the Doppler effect in astronomy, which is caused by the expansion of the universe. “It’s the same kind of effect that you experience with sound, for instance as an ambulance approaches with sirens wailing,” outlines Professor Caputi. “As it approaches, the tone will be quite high, then as it moves away, the pitch of the sound will be much lower.”
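The 'shift into the red' that Professor Caputi describes is quantified by the redshift z, which relates observed and emitted wavelengths and, for the cosmological case, the expansion of the universe through the scale factor a(t). As a brief aside, the standard textbook relation (not specific to the BUILDUP survey) is:

$$ 1 + z \;=\; \frac{\lambda_{\mathrm{obs}}}{\lambda_{\mathrm{emit}}} \;=\; \frac{a(t_{\mathrm{obs}})}{a(t_{\mathrm{emit}})} $$

so a galaxy observed at z = 4, for instance, emitted its light when the universe's scale factor was a fifth of its present value.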


Galaxy formation models

Researchers are also working with galaxy formation models, based on the Cold Dark Matter (CDM) framework, which generate predictions of how cold dark matter was distributed in the universe at different periods. While a number of other cosmological models have been developed, the CDM model has been the most successful in terms of predicting the general properties of galaxies that we see today. "The CDM makes specific predictions of how galaxies are distributed in the universe spatially, and the large-scale structures of the universe that we see today. If you compare those predictions to the data that we see in nearby galaxies, the CDM framework is the best," says Professor Caputi. This does not mean that models are entirely accurate, however, and Professor Caputi says there are some limitations. "Sometimes the models cannot correctly predict how observed galaxies behave, and their properties at different cosmic times," she acknowledges. By comparing observed data with predictions from models, researchers can then assess whether those predictions are correct or not, and learn more about the galaxies they are observing. Professor Caputi and her colleagues are working on a number of papers, and significant progress is being made. "We are really understanding much better how galaxy evolution proceeded in the time before the peak activity epoch of the universe," she says. The development of the James Webb Space Telescope will also open up new observational possibilities; Professor Caputi is keen to work with data from this telescope in future. "This new telescope will be extremely powerful, but there will also be limitations. We need to understand that preferably before the data arrives, in order to understand how we are going to work with that data and make it suitable for analysis," she outlines.

BUILDUP
Galaxy Buildup in the Young Universe: from the First Billion Years through the Peak Activity Epoch

Project Objectives

The aim of the BUILDUP project is to reconstruct the history of galaxy assembly and evolution from the first billion years of cosmic time through the peak activity epoch, which occurred 10 billion years ago, in order to provide fundamental constraints for galaxy evolution models. BUILDUP involves the scientific exploitation of one of the largest observing programmes ever conducted with the Spitzer Space Telescope, and constitutes a bridge between current and future generations of infrared galaxy surveys.

Project Funding

This Project is exclusively funded by the European Research Council and the only beneficiary is the University of Groningen.

Contact Details

Professor Karina I. Caputi Kapteyn Astronomical Institute University of Groningen P.O. Box 800 9700 AV Groningen The Netherlands T: +31 50 363 8325 E: karina@astro.rug.nl W: https://www.astro.rug.nl/~karina/ERC_BUILDUP.html Cordis: https://cordis.europa.eu/project/rcn/200775_en.html

- Caputi, K.I. et al., Star Formation in Galaxies at z~4-5 from the SMUVS Survey: A Clear Starburst/Main-sequence Bimodality for Hα Emitters on the SFR-M* Plane, The Astrophysical Journal, 849, 45 (2017).
- Cowley, W. I. et al., The Galaxy–Halo Connection for z=1.5-5 as Revealed by the Spitzer Matching Survey of the UltraVISTA Ultra-deep Stripes, The Astrophysical Journal, 853, 69 (2018).
- Bisigello, L. et al., The Impact of JWST Broadband Filter Choice on Photometric Redshift Estimation, The Astrophysical Journal Supplement, 227, 19 (2016).
- Ashby, M.N.L. et al., Spitzer Matching survey of the UltraVISTA ultra-deep Stripes (SMUVS): Full-mission IRAC Mosaics and Catalogs, The Astrophysical Journal Supplement, in press (2018).

Professor Karina Caputi

X-ray: ©NASA/CXC/UMass Lowell/S. Laycock et al.; Optical: Bill Snyder Astrophotography.

Karina Caputi is Associate Professor of Astronomy and ERC Consolidator Grant Laureate at the Kapteyn Astronomical Institute, University of Groningen. She previously held research positions in France, the UK and Switzerland. She has a combined background in physics and astronomy.



A new approach to flutter in flight

Flutter can break an aircraft's wings, so it's an important consideration in design. Researchers in the Flexop project are developing new tools to both model flutter and to control it during flight, which could open up new possibilities in design and boost the competitiveness of European industry, as Dr Bálint Vanek explains.

A phenomenon which occurs when aerodynamics and structures couple in an unstable way, flutter can cause aircraft wings to break, and so is a correspondingly important consideration in design and development. Based at the Institute for Computer Science and Control of the Hungarian Academy of Sciences (MTA SZTAKI) in Budapest, Dr Bálint Vanek is the coordinator of Flexop, a project which brings together academic and commercial partners to develop multidisciplinary aircraft design capabilities. "We are developing methods to actively control aeroelastic deformation," he explains. Researchers in the project are developing active control methods to get the flutter phenomenon under control, and to ensure that it doesn't have adverse consequences. "We are developing methods and tools to model aero-servo-elastic behaviour, while we are also developing methods and tools to get it under control," continues Dr Vanek. This could open up new possibilities in aircraft design and development. Currently, aircraft are designed to be free from flutter under all flying conditions, but with sophisticated active control methods, it would potentially be possible to control flutter during flight. "We are looking at whether this flutter phenomenon could be brought closer to the normal operating range in future, where there's always an active control method in place which keeps it under control," says Dr Vanek. This means that designs previously considered unviable could be looked at afresh, while also opening up new possibilities. "For example, an airframe which is very flexible and prone to flutter may be allowed to fly in future. This is because flutter will not occur, given that you have a smart control system in place which is able to suppress the oscillations caused by flutter," explains Dr Vanek. "It would be like an additional layer of the fly-by-wire system."
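Flutter analysis, in its most basic form, amounts to tracking the eigenvalues of the coupled structural and aerodynamic equations of motion as airspeed increases: the instability sets in where a mode's damping, the real part of an eigenvalue, crosses zero. The Python sketch below illustrates only that general idea on a deliberately toy two-degree-of-freedom pitch/plunge system with made-up coefficients; it is not the FLEXOP model, toolchain or controller.

```python
import numpy as np

# Toy 2-DOF pitch/plunge model with invented structural (M, C, K) and
# aerodynamic (Ca, Ka) matrices; the aerodynamic terms grow with airspeed v.
M = np.diag([1.0, 0.1])
C = np.diag([0.2, 0.05])
K = np.diag([50.0, 20.0])
Ca = np.array([[0.00, 0.02], [0.01, 0.00]])
Ka = np.array([[0.0, 0.4], [0.1, 0.0]])

def first_unstable_speed(speeds):
    """Return the first airspeed (toy units) at which any aeroelastic mode loses damping."""
    for v in speeds:
        # First-order form of M q'' + (C - v*Ca) q' + (K - v^2*Ka) q = 0.
        A = np.block([[np.zeros((2, 2)), np.eye(2)],
                      [-np.linalg.solve(M, K - v**2 * Ka), -np.linalg.solve(M, C - v * Ca)]])
        if np.max(np.linalg.eigvals(A).real) > 0:  # a mode's damping has crossed zero
            return v
    return None

print(first_unstable_speed(np.linspace(0.0, 30.0, 301)))
```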

Aeroelastic behaviour

A deep understanding of flutter and the behaviour of aircraft wings during flight is central to this wider goal.


An aircraft wing can be thought of as a kind of flexible beam; researchers are exploring new methods of controlling its behaviour in terms of bending, torsion and certain other axes. "With both active and passive feedback mechanisms, it's possible to reduce the loads on the wing. Very sophisticated mathematical models and tools are required to model the aeroelastic behaviour and design the feedback controls," outlines Dr Vanek. Within the project, researchers have access to large volumes of data on how wings behave during flight. "We have developed a complete unmanned demonstrator aircraft to validate our tools, and we have put a large number of sensors on the wings of this aircraft. Each wing has hundreds of measurement points to monitor wing stress and deformation with fibre Bragg sensors, in addition to the six accelerometers and 12 rate gyros, which are used by the onboard flight control computer in the development of the active control mechanisms," says Dr Vanek. This data is then used to validate the mathematical models, while also helping researchers gain deeper insights into the behaviour of the wing during flight. Research into wing aeroelasticity is coupled with investigations into the flight control system during the design process, representing a new approach. "Aeroelastic analysis and flight control have traditionally been done separately," explains Dr Vanek. However, researchers in the FLEXOP project are now trying to couple these two areas in the design process, which could help improve efficiency and reduce the costs of aircraft development. "As it's in the design stage the feedback loop is closed earlier, so potentially the design changes are less costly," he explains. "It's a very multi-disciplinary project, yet traditionally flight control and aeroelasticity people did not work together that closely." The two groups may not always interpret an aeroelasticity model in the same way, so methods and tools have been developed in the project to effectively bridge the gap. Alongside the tools and methods that are



FLEXOP
Flutter free flight envelope expansion for economical performance improvement

Project Objectives

The FLEXOP project is about developing multidisciplinary aircraft design capabilities for Europe that will increase competitiveness with emerging markets - particularly in terms of aircraft development costs. A closer coupling of wing aeroelasticity and flight control systems in the design process opens new opportunities to explore previously unviable designs. Common methods and tools across the disciplines also provide a way to rapidly adapt existing designs into derivative aircraft, at a reduced technological risk.

Project Funding

Horizon 2020. https://flexop.eu/facts

Project Partners

• https://flexop.eu/partners

Contact Details

being developed, an aircraft itself is being built within the project, which Dr Vanek says is an important feedback mechanism. “It effectively helps to get both groups on the same page, both the aeroelasticity people and the flight control people,” he says. The aircraft itself is nearing completion, and different experimental wing-sets have been developed. “The aircraft will be flying at some point over the next few months, and we will start gathering data, while in parallel we are also going to work on the scale-up,” continues Dr Vanek. “In the scale-up we want to essentially extrapolate all the tools and methodologies to a full-sized commercial aircraft.”

"We are looking at whether the flutter phenomenon could be brought closer to the normal operating range in future, where there's always an active control method in place which keeps it under control."

This could provide a significant boost to the European aircraft industry at a time when it faces increasingly intense competition from emerging markets. A number of different scale-up configurations have been identified, and Dr Vanek is now looking to explore the likely impact on aircraft performance. "We want to look at the trade-offs in terms of payload and fuel efficiency in those new configurations, trade-offs which are enabled by the technologies and methodologies which we have developed in the project," he outlines. These are of course important considerations for manufacturers, and so there is a lot of interest from the commercial sector in the project's research. "I presented our preliminary results at the CleanSky 2 meetings, and there was a lot of interest in the technologies that have been developed in the project," says Dr Vanek. The wider goal in the project is to help improve the efficiency of the aircraft design cycle, and also the performance of aircraft in terms of issues like fuel efficiency and their overall payload, or their carrying capacity. One major objective in the field is to design a lighter airframe, which would have knock-on effects. "If the airframe is lighter, then less fuel is required. So in principle, if you don't have that much structural reinforcement in the wing, then you can use that to either fly


Virág Bodor, Project Manager MTA-SZTAKI Hungarian Academy of Sciences Institute for Computer Science and Control H-1111 Budapest, Kende u. 13-17. Hungary T: +36 1 279 6220 E: bodor.virag@sztaki.mta.hu W: https://flexop.eu

Dr Bálint Vanek

Dr Bálint Vanek's research interests include safety aspects of both manned and unmanned aerial vehicles (UAVs), covering aeroservoelasticity, fault detection and reconfigurable control of aerospace systems, mainly using the linear parameter-varying (LPV) framework. The insertion of UAVs into common airspace is another research topic he is interested in; he is working on a safety-critical avionics architecture for small UAVs and on providing them with a vision-based sense-and-avoid capability. He is the coordinator of the H2020 research project FLEXOP, working on flexible aircraft control technologies to mitigate the effects of flutter, and also the PI of the EU H2020 project VISION.

further or to carry a greater payload,” says Dr Vanek. Improved aeroelastic tailoring and optimisation of the aircraft structure could lead to further weight reductions, an important consideration for industry. “For example, aerocomposites company FACC is a tier-one supplier to several aerospace companies. It’s very important for them to see how aircraft wings will be built in future,” stresses Dr Vanek.



New possibilities in artificial intelligence

Huge volumes of data are available today on many of the most pressing challenges facing society, yet current computing architectures are relatively inefficient at handling this data. We spoke to Dr Abu Sebastian about the work of the Projestor project in developing a new memory device concept which could open up new possibilities in Artificial Intelligence.

The typical Artificial Intelligence (AI) system is currently run on computers which consume kilowatts or even megawatts of power. By contrast the human brain consumes something like 20 watts, so there is a big disparity in the amount of power required. "A big factor behind this disparity is that the architecture with which the brain computes is fundamentally different from how conventional computers work. For example, in the brain there is no notion of processing happening in one place, and memory being stored in another – they are co-located," explains IBM researcher Dr Abu Sebastian. As the Principal Investigator of the Projestor project, Dr Sebastian is now exploring a new computing paradigm. "In the project we are trying to come up with new physical computing systems which are, in a sense, inspired by the architecture of the brain," he explains. "We are looking at having co-located memory and processing for example, and trying to bring down this disparity between conventional computing systems and the brain. We call it in-memory computing."

New computing architecture

This research is built on a recognition of the limitations of the existing computing architectures. Current computing systems are based on the von Neumann architecture, where storage is in one location and the processing engine elsewhere; every time a computation is performed, data is shuttled between the two units. "This is very energy-intensive," outlines Dr Sebastian. Researchers in the project aim to develop new solutions, new architectures, where data doesn't need to be moved between memory and processing, greatly improving energy efficiency. "Conventional memory can be viewed as a place where data is dumped, and there isn't any intelligence associated with that. It's just a place where you store stuff," says Dr Sebastian. "We're trying to design a new kind of computational memory chip, where the memory is actually an active participant in the computation process. It's not enough to view memory as a place where you simply store information."


A computational memory chip based on phase change memory devices.

"We are trying to come up with new physical computing systems which are, in a sense, inspired by the architecture of the brain. We are looking at having co-located memory and processing for example."

The human brain is an important source of inspiration in this respect, as it's among the best available examples of a cognitive computer, with very closely entwined memory and processing units. While researchers are still only scratching the surface when it comes to understanding the architecture of the brain, Dr Sebastian says it's still possible to derive useful insights. "The brain seems to compute using memory itself, through the synapses. So, how about building memory chips which can compute?" he outlines. "We take the fact that you are processing in memory by exploiting some physical attributes of the storage mechanism. In the case of brain architecture, that's done through the synapses. The way in which we store information in our computational memory units is through a type of memory called phase change memory." In a phase change memory device, information is stored in terms of the atomic configuration of certain types of materials. If the atoms are in an ordered state, it is logic 1, and if they are in a disordered state it is logic 0, while there are also intermediate states. "The idea is to use the physical properties of these devices to perform computational tasks. This is all operating at the nanoscale – the devices are typically around tens of nanometres in length," outlines Dr Sebastian. Research is progressing well, and Dr Sebastian says work has already reached quite an advanced stage. "We have developed a computational memory platform with these new devices that I have talked about," he continues. "Recently, we have shown how in-memory computing can tackle problems such as solving systems of linear equations, as well as unsupervised learning of temporal correlations between event-based data streams. A significant part of the research conducted within Projestor is also aimed at improving the properties of the memory devices, such as increasing the precision of computation when using these devices."
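One workhorse operation for in-memory computing with resistive devices such as phase change memory is analog matrix-vector multiplication: matrix entries are encoded as device conductances, a voltage vector is applied, and Ohm's and Kirchhoff's laws deliver the result as currents without the matrix ever leaving memory. The Python sketch below mimics only this principle in an idealised way, with invented conductance values and an arbitrary noise level standing in for device imprecision; it is not IBM's hardware or software stack.

```python
import numpy as np

def crossbar_matvec(G, v, noise_std=0.02, rng=None):
    """Idealised analog matrix-vector product on a conductance crossbar.

    G: matrix encoded as device conductances (one device per entry).
    v: input voltages applied to the rows.
    Each column current is the Kirchhoff sum of G[i, j] * v[i]; the Gaussian
    term stands in for the limited precision of analog phase change devices.
    """
    rng = rng or np.random.default_rng(0)
    exact = G.T @ v
    return exact + rng.normal(scale=noise_std * np.abs(exact).max(), size=exact.shape)

G = np.array([[0.9, 0.1], [0.2, 0.8]])   # made-up conductances
v = np.array([1.0, -0.5])                # made-up input voltages
print("analog result:", crossbar_matvec(G, v))
print("exact result: ", G.T @ v)
```

In practice such an imprecise analog result is typically refined by a digital correction step, the mixed-precision approach that appears in the project's publication list.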

Artificial intelligence



The wider background to this research is the start of the cognitive computing era and the increasing pervasiveness of AI in everyday life. While some of us may still think of AI as being quite a futuristic idea, Dr Sebastian says that in fact it is in everyday use today. "Web companies use it for search, mobile cameras use it for identifying objects in pictures. There are many applications, including in robotics, the Internet of Things (IoT) and healthcare," he outlines. For example, Dr Sebastian's colleagues are using AI and sensors to understand the progression of lung diseases based on the sound of a cough and the colour of a patient's saliva. "The AI uses this data to potentially predict when the patient is about to have an acute event, called an exacerbation, where they nearly suffocate and require re-hospitalization. Unless the doctor is monitoring them 24/7 this would be impossible," he points out. Many of these types of AI applications are currently run in the cloud, so when an individual uses their mobile phone to translate some text, for example, the translation and the data processing are done elsewhere. While the cloud

will still be required for more advanced AI applications, improving the energy efficiency of computing systems could open up the possibility of performing some tasks in a mobile device and widening the use of AI. “The idea is to make AI more pervasive – to make your mobile phone or your healthcare device more intelligent for example. So we really want to bring down the energy consumption of these computers that are doing the AI,” says Dr Sebastian. A lot of progress has been made in these terms, with the project making an important contribution to the development of a new computing architecture. “The goal is to establish in-memory computing as a post von-Neumann computing paradigm,” continues Dr Sebastian. This could eventually lead to AI becoming a part of more everyday devices. It is already quite commonly used today of course, but in order to widen its applications still further, Dr Sebastian says continued technological development is required.

PROJESTOR
Projected Memristor: A nanoscale device for cognitive computing

Project Objectives

We are entering the era of cognitive computing, which holds great promise in terms of deriving intelligence and knowledge from huge volumes of data. However, it is becoming clear that to build efficient cognitive computers, we need to transition to non-von Neumann architectures where memory and logic coexist in some form. The main goal of the Projestor project is to explore such a memory device concept.

Project Funding

ERC Consolidator Grant https://www.ibm.com/blogs/research/2016/03/ ibm-scientist-abu-sebastian-develops-futurememory-computer-paradigms-prestigiouseuropean-grant/

Contact Details

Dr Abu Sebastian Principal Research Staff Member IBM Research - Zurich Säumerstrasse 4 8803 Rüschlikon Switzerland T: +41 44 724 8684 E: ASE@zurich.ibm.com W: http://www.erc-projestor.eu/ W: https://researcher.watson.ibm.com/researcher/view.php?person=zurich-ASE

M. Salinga, B. Kersting, I. Ronneberger, V.P. Jonnalagadda, X.T. Vu, M. Le Gallo, I. Giannopoulos, O. Cojocaru-Miredin, R. Mazzarello, A. Sebastian, "Monatomic phase change memory," Nature Materials, 17, 681–685, 2018 (Cover).
M. Le Gallo, D. Krebs, F. Zipolli, M. Salinga, A. Sebastian, "Collective structural relaxation in phase-change memory devices," Adv. Electronic Materials, 2018.
M. Le Gallo, A. Sebastian, R. Mathis, M. Manica, H. Giefers, T. Tuma, C. Bekas, A. Curioni, E. Eleftheriou, "Mixed-precision in-memory computing," Nature Electronics, 1, 246–253, 2018.
A. Sebastian, T. Tuma, N. Papandreou, M. Le Gallo, L. Kull, T. Parnell, E. Eleftheriou, "Temporal correlation detection using computational phase-change memory," Nature Communications, 8, 2017.
T. Tuma, A. Pantazi, M. Le Gallo, A. Sebastian, E. Eleftheriou, "Stochastic phase-change neurons," Nature Nanotechnology, 11(8), 2016 (Cover).

Dr Abu Sebastian

Dr Abu Sebastian is a Principal Research Staff Member and Master Inventor at IBM Research - Zurich. He is actively researching the area of non-von Neumann computing with the intent of connecting the technological elements with applications such as artificial intelligence. He has published over 150 articles and holds over 40 granted patents.




Photograph by Veronika Hohenegger, The Leibniz Supercomputing Centre (LRZ).

Racks and brains

The age of exascale computing is coming, and dramatic increases in computational speed could lead to important scientific breakthroughs, yet software must also evolve in line with changes to the computing paradigm. The SPPEXA programme supports research into software for tomorrow's computing environment, as Dr Benjamin Uekermann explains.

The first exascale computer, capable of 10¹⁸ floating point operations (flops) a second, is expected to come into operation at some point around the mid-2020s. This dramatic increase in speed could lead to important breakthroughs across many areas of science, so a lot of resources have been invested in the race to develop the first exascale computer, yet less attention has been paid to the software required for this new computing environment. "People often forget that you also need funding to develop the software to run on these machines," says Dr Benjamin Uekermann, SPPEXA Programme Manager. SPPEXA, a priority programme funded by the German Research Foundation (DFG), aims to develop software for this new computing paradigm. "With a new high-performance computing (HPC) architecture, different algorithms are required to improve simulation efficiency," continues Dr Uekermann.

Computer simulations

This research holds relevance to several scientific disciplines, from astrophysics to bioinformatics to climate change, with supercomputers commonly used to run many different types of simulations. Typically, when the people running such a simulation find that it is running too slowly, or they want to address a different problem, they


ask supercomputing specialists to modify and improve the code, yet Dr Uekermann says this approach has its limitations. “You are only scratching the surface of where you can improve. Sometimes, you need to change more fundamental things to improve efficiency on such HPC systems,” he explains. The approach within the programme is more collaborative, bringing computing specialists together with researchers in specific applications. “We have joint projects with application people and mathematics specialists in HPC. They work together for six years and develop the application,” outlines Dr Uekermann.

"With a new high-performance computing architecture, different algorithms are required to improve simulation efficiency."

There are 17 projects within SPPEXA, focussing on different applications and areas of research, including astrophysics, plasma physics and fluid dynamics. More powerful computers allow researchers in these disciplines to include more data in simulations and so gain deeper insights into important questions, yet effective software is essential to realising these benefits. "With new machines you need new software," points out Dr Uekermann. As computing enters the era of massive parallelism, Dr Uekermann says certain things will change. "For example, we agree that there is a high probability that there will be single failing entities on a regular basis," he says. "If a single chip crashes, we cannot allow that to lead to the complete programme crashing. So we need software and algorithms that are capable of dealing with such situations. There are different ways of tackling this; the most elegant uses algorithms to resolve the problem." This type of problem is among the six research directions being pursued within SPPEXA, which include computational algorithms, system software and programming. The other research directions are data management and exploration, and then application software, a set of priorities which reflects the complexity of exascale computing. "Different projects work on different layers that we need for software. So there are projects that work on very low levels, where it's really about the systems software. For example, there's one project in SPPEXA that is developing a file system that can work on an exascale machine," says Dr Uekermann.
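Coping with regularly failing components, as described above, is usually handled either by algorithm-based fault tolerance or by checkpoint/restart, where an application periodically saves enough state to resume after a crash. The Python sketch below shows the checkpoint/restart pattern in its most generic form; the file name, interval and the trivial update loop are placeholders rather than anything taken from the SPPEXA codes.

```python
import os
import pickle

CHECKPOINT = "state.ckpt"   # placeholder path

def run_simulation(total_steps, checkpoint_every=100):
    """Toy time-stepping loop that survives crashes by checkpointing its state."""
    # Resume from the last checkpoint if one exists, otherwise start fresh.
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT, "rb") as f:
            step, state = pickle.load(f)
    else:
        step, state = 0, 0.0

    while step < total_steps:
        state += 0.1          # stand-in for one real simulation step
        step += 1
        if step % checkpoint_every == 0:
            with open(CHECKPOINT, "wb") as f:
                pickle.dump((step, state), f)
    return state

print(run_simulation(1000))
```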



SPPEXA

The SPPEXA team

Software for Exascale Computing

Project Objectives

SPPEXA addresses fundamental research on various aspects of HPC software, which is particularly urgent against the background that we are currently entering the era of ubiquitous massive parallelism. Only this massive parallelism will smooth the way for extreme computing up to exascale, i.e. computations with 10¹⁸ floating point operations per second, and for the insight resulting from those simulations.

Project Funding

SPPEXA is funded by the German Research Foundation (DFG) (German Priority Programme 1648). Approximately €24 Million in total.

Project Partners

SPPEXA involves more than 50 universities and research institutes in Germany, France, Japan and elsewhere around the world.

Other projects within SPPEXA look at higher levels, where computer scientists work in close collaboration with application specialists to develop new algorithms. "In order to tackle the problem, you have to work with all these layers, to make them ready for exascale computing," stresses Dr Uekermann. The various projects in SPPEXA differ quite significantly in some respects however, with some taking an evolutionary approach, while others are more revolutionary. In the former case, Dr Uekermann says the focus may be on something quite low-risk. "This could be a software feature that is quite low-risk, but which has to be ported to exascale, so that it is ready by the time exascale machines are available," he explains. Other projects however are more high-risk, where scientists try to revolutionise certain application domains, for example by changing the basic principles behind simulations. "There is more risk behind this type of project. If it works out, then many codes will have to be rewritten, to be ported to a new programming paradigm. But it would also have a much bigger impact," continues Dr Uekermann. "We have ported many applications and made them ready for exascale computing. We have been working on existing codes with existing communities." A lot of progress has also been made in the more revolutionary projects, with the development of prototypes which show that certain things can be dramatically improved with the development of new techniques. The focus within SPPEXA is on software for exascale computers, yet Dr Uekermann says the project's research also holds implications for today's machines.


"The vast majority of the things that we have developed will run on today's supercomputers. But they have been built in a way that would allow them to be transferred to the next generation of machines," he outlines. The most powerful supercomputer in the world currently is Summit, which has a performance of 122.3 petaflops (a petaflop is 10¹⁵ floating point operations a second). The most powerful machine in Germany, meanwhile, is the recently set up SuperMUC-NG, based at the Leibniz Supercomputing Centre near Munich, which has a peak performance of 26.9 petaflops. These machines are already capable of dealing with huge volumes of data, yet an exascale machine would represent a dramatic step forward on even these levels of performance. While the point at which such a machine will be developed is difficult to predict, it is moving closer. "The point at which an exascale computer will be developed is difficult to precisely forecast, but it's not far away in terms of the science," says Dr Uekermann. This underlines the importance of continued research into software for exascale computers, and for supercomputers more generally. While the SPPEXA programme itself cannot be extended beyond its current term, Dr Uekermann hopes that research will continue. "We are working hard to convince funding bodies that there is a long-term need to develop software for HPC machines," he says. Many of the projects in the second phase of SPPEXA brought together researchers from different countries, which could form the basis for further investigation, while Dr Uekermann says there is also the possibility of wider collaborations in future. "We have worked together with the French and Japanese funding agencies, to look at co-funding international projects. This has turned out to be really valuable, because we can bring together people with different backgrounds and share expertise," he outlines.
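To put the figures quoted above side by side, a two-line calculation shows the size of the remaining jump (numbers as given in the article, factors rounded):

```python
EXAFLOP = 1e18          # target: 10**18 floating point operations per second
summit = 122.3e15       # Summit, petaflops as quoted above
supermuc_ng = 26.9e15   # SuperMUC-NG peak, as quoted above

print(f"Exascale vs Summit:      x{EXAFLOP / summit:.1f}")       # ~x8.2
print(f"Exascale vs SuperMUC-NG: x{EXAFLOP / supermuc_ng:.1f}")  # ~x37.2
```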

Contact Details

Project Manager, Dr. Benjamin Uekermann Technische Universität München Institut für Informatik Boltzmannstraße 3, 85748 Garching bei München T: +49 89 289 18600 E: uekerman@in.tum.de W: www.sppexa.de W: https://link.springer.com/ book/10.1007/978-3-319-40528-5

Benjamin Uekermann

Benjamin Uekermann studied Applied Mathematics and Computer Science at the Technical University of Munich. Since receiving his PhD in 2016, he has worked as project manager of SPPEXA. His research focuses on software development for multi-physics simulations. He is currently one of the main developers of the coupling library preCICE.



Building a clearer picture of dynamic visualisations

Dynamic visualisations can provide valuable insights into time-varying data like stock prices and traffic information, helping us to identify patterns and get a clearer overall picture of how they are likely to evolve. Dr Kevin Verbeek tells us about his work in analysing the stability of the geometric algorithms behind these visualisations.

Many people monitor traffic data on phone apps or other technologies before setting out on a journey, helping them to avoid hot-spots and get to their destination as quickly as possible. The data on traffic patterns constantly evolves, which is an important consideration in terms of presenting it to drivers in an accessible and digestible way. "If you want to take that data and turn it into a picture using an algorithm, then you need to make sure that if the data slightly changes, then the picture that you produce doesn't suddenly completely change," points out Dr Kevin Verbeek. The goal when visualising data, whether it's traffic information, stock prices, or something else entirely, is to identify patterns. "You want to figure out exactly what is happening in the data. A visual picture will give you an idea or summary of that, so you need this to be very stable," explains Dr Verbeek.

Algorithm stability

The stability of the underlying algorithms behind these visual pictures is an important factor in this respect, a topic which lies at the heart of Dr Verbeek's research. Based at the Eindhoven University of Technology, Dr Verbeek is taking a step back to look at fundamental questions around the stability of an algorithm. "The question arises of how do I measure stability? How do I analyse stability? What tools can I use to do this formally?" he outlines. In a simple sense, an algorithm takes an input and generates an output; one measure of stability involves relating the two. "If the difference in the input is small, then the difference in the output should also be small. This is the most important aspect," explains Dr Verbeek. "From a data visualization point of view, it would be even better if the difference in output corresponded directly to the difference in input."
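One simple way to make the input/output relationship described above concrete is to estimate, over many small perturbations, the worst observed ratio of output change to input change, a Lipschitz-style stability measure. The Python sketch below applies this idea to two toy summaries of a one-dimensional point set, the mean and the maximum; it is an illustration of the measuring principle only, not one of the algorithms studied in the project.

```python
import numpy as np

def empirical_stability(algorithm, base_input, eps=1e-3, trials=1000, rng=None):
    """Worst observed |output change| / ||input change|| under small random perturbations."""
    rng = rng or np.random.default_rng(0)
    base_output = algorithm(base_input)
    worst = 0.0
    for _ in range(trials):
        delta = rng.normal(scale=eps, size=base_input.shape)
        change = abs(algorithm(base_input + delta) - base_output)
        worst = max(worst, change / np.linalg.norm(delta))
    return worst

data = np.array([0.0, 1.0, 2.0, 2.001])      # toy input: four points on a line
print(empirical_stability(np.mean, data))    # smaller: the mean spreads any perturbation over all points
print(empirical_stability(np.max, data))     # larger: the maximum follows whichever point happens to be on top
```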

There are also trade-offs to be made in terms of the specific characteristics of an algorithm. While it might seem desirable for an algorithm to find the optimal solution for a particular problem, this may have an impact on its stability. "The interesting thing is that if you always want to find the optimal solution – which in some cases we can, although it's hard – then this optimal solution might be very unstable. We often find that there is a trade-off between how good your solution can be, and how stable the algorithm is," says Dr Verbeek. With a little bit more leeway with respect to the nature of the solution, it's then possible to make an algorithm more stable. "The optimum solution is very fixed, it's a small point. If you give yourself a little more room in terms of the output that you're producing, then you can use that room to become more stable," explains Dr Verbeek.

The new treemapping algorithm developed by Dr Verbeek and his collaborators is much more stable than existing algorithms (Hilbert and squarified), whilst at the same time keeping the quality of the visualization high over all time steps.




STABLE GEOMETRIC ALGORITHMS
Less confusion from dynamic information

Project Objectives

An algorithm is stable if small changes in the input lead to small changes in the output. Stable algorithms play an important role in, for example, the visualization of time-varying data. The main goals of this project are to develop a new theoretical framework for algorithm stability, introduce new methodologies to analyze algorithm stability, and to develop new stable (geometric) algorithms.

Project Funding

NWO VENI: “Stable Geometric Algorithms” (2015, 248k€)

A key objective in this research is to develop new analysis techniques to assess the stability of algorithms, while Dr Verbeek also intends to apply them and see if they actually work in practice. While this is not an entirely new area of investigation, researchers are yet to develop a reliable theory on how to deal with the nature of stable algorithms. "That is one of the main goals of this project – to define what kinds of analysis can be used and what kinds of tools are useful," says Dr Verbeek. These tools could in principle be applied to any kind of algorithm, although Dr Verbeek acknowledges that they may be more applicable to some than others. "Usually the most interesting types of algorithms in terms of our research are those where your input is essentially continuous. In that way you can map it to something that takes place over time," he says. The stability of the underlying algorithm has a practical impact on visualisation, yet currently there is no rigorous theory with which to discuss these problems and provide a framework for further research. By developing the theoretical foundations behind this essentially practical problem, Dr Verbeek hopes to lay the groundwork for further research in future. "The goal of the project is to actually ignite a level of interest in these theoretical problems relating to the stability of algorithms. When you have theoretical knowledge about a particular problem, you build an understanding that you can then use in practice," he explains. This can lead to new insights and spur continued research. "We've been writing papers describing how we use these techniques, and aiming to show that they can lead to interesting insights," outlines Dr Verbeek.

"The interesting thing is that if you always want to find the optimal solution then this optimal solution might be very unstable. We often find that there is a trade-off between how good your solution can be, and how stable the algorithm is."

This is partly about encouraging theoretical and practical researchers to work in close collaboration, which in practice is very difficult: not only do they almost speak different languages in a sense, but there are also other challenges to overcome. "I usually try to work on the intersection between theoretical and practical research, so I write theoretical papers and I write practical papers. But there has been a reluctance to accept these kinds of papers – where if you work on the intersection and you write a theory paper, then somehow the theoretical aspects are considered to not be deep enough, because you based it on a real practical problem," says Dr Verbeek. "On the other hand, if you write a practical paper that uses such theory to solve a practical problem, then the methods are often not well understood and are considered to be too complicated."

Knowledge transfer

Knowledge transfer is therefore a challenge, but it's also something that inspires Dr Verbeek's work and will remain a central consideration. While the project itself will conclude relatively soon, investigations into both the theoretical and practical issues around the stability of algorithms are ongoing, and Dr Verbeek hopes to help build and strengthen the research community. "I would like some of these things to be taken up by the research community, in particular seeing other people using these techniques and tools to prove certain things about their algorithms," he outlines. Dr Verbeek also plans to attend conferences and publish papers, which will help generate interest. "The idea is to get this to as many conferences and workshops as we can, where we can present our work and push these ideas," he says. "The interaction between theoretical and practical issues is important, as that's where the problem lies. There's really an interaction between the two."

Project Partners

• Prof. dr. Bettina Speckmann
• Dr Wouter Meulemans
• ir. Jules Wulms
• ir. Max Sondag

Contact Details

Project Coordinator, Dr Kevin Verbeek Mathematics and Computer Science TU Eindhoven, MF 4.106 P.O. Box 513 5600 MB Eindhoven T: +31 40 247 8926 E: k.a.b.verbeek@tue.nl W: https://www.win.tue.nl/~kverbeek W: https://www.win.tue.nl/aga/

I. van der Hoog, M. van Kreveld, W. Meulemans, K. Verbeek, and J. Wulms. Topological Stability of Kinetic k-Centers. ArXiv:1810.00794, 2018.
W. Meulemans, B. Speckmann, K. Verbeek, and J. Wulms. A Framework for Algorithm Stability and its Application to Kinetic Euclidean MSTs. In Proc. 13th Latin American Theoretical Informatics Symposium (LATIN), pp. 805–819, 2018.
M. Sondag, B. Speckmann, and K. Verbeek. Stable treemaps via local moves. IEEE Transactions on Visualization and Computer Graphics, 24(1):729-738, 2018.

Dr Kevin Verbeek

Dr Kevin Verbeek is an assistant professor in the Applied Geometric Algorithms group at TU Eindhoven. His main research interests lie within the area of computational geometry. Kevin is specialized in using theoretical techniques from computational geometry to solve real-world problems, mostly in the area of information visualization.



What can we expect for scientists?

Brexit is still dominating the news, as it has been for the last two years. A 585-page draft of a deal was proposed on 14 November 2018 by the UK's Cabinet and then, almost instantly, contested by Brexiteers inside the UK Government wanting a harder line. Amongst the arguments, we have to ask: how will the UK fund science, and what impacts will there be on innovation and scientific discovery?

By Richard Forsyth

Brexit negotiations have been fraught and difficult. In reality, what the split means for the UK and Europe is still largely unknown, because this is an unprecedented break-up after more than 40 years of being entwined with policies, collaborations and funding. For scientists, a question that is weighing heavily is: how will this impact on research projects?

The mood of scientists

As people pore over the details of the proposed draft withdrawal agreement, there are still a great many open questions which will not be answered until after Brexit, beyond March 2019. However, we can glean a few details for those wanting something solid. For example, the draft does explain that the UK will leave Euratom, the European Atomic Energy Community, when it leaves the EU, which will affect some projects. The mood of scientists, in the UK at least, remains for the larger majority relatively grim. In a BBC interview, the Nobel Prize winner Sir Paul Nurse summed up the mood in UK science circles prior to the Brexit draft: "The increasing chaos, and that's what it looks like to us all, around the Brexit negotiations is causing huge concern among scientists." The sentiment of discontent was captured in a poll by the journal Nature, which questioned 907 researchers in the UK and revealed that 83% supported Remain whilst just 12% opted for Leave. A majority of 78% saw leaving the EU as harmful and over 50% agreed that it would be 'very harmful'. These are clear indications of how scientists in the UK feel: that they will struggle to thrive under the shadow of Brexit's negative connotations. At the time of writing this feature, in-fighting within the UK Government is dominating the news and ministers have remained reluctant to comment on fine details about science funding and related impacts. It's important for governments, UK and European, to realise that scientific research is a crucial piece of the Brexit puzzle. Scientific research for healthcare, new technology, best practice and new understanding of our world: these are key drivers of all future development.



Historically, the aims of UK scientists have been bolstered through European funding. The Campaign for Science and Engineering points out that between 2007 and 2013, the UK gave €5.4bn to the EU to fund research and development and in the same period received €8.8bn back for research. Since 2014, UK scientists have been awarded just over €4.7bn of research funding via Horizon 2020. Britain is considered a European leader in science, with renowned research institutions, and that must be relevant to why the country has received around 15 percent of the Horizon 2020 funding. With Brexit, UK scientists are anxious to understand what will change and any new mechanisms for funding and collaboration. There is the argument that if the UK does not send the money over to the EU pot in the first place it can at least keep that money for UK science projects, but of course that’s less than the EU pays back. There is also no guarantee how the money that would have been sent over will be distributed. It’s the lack of firm guarantees in the UK Government’s commitments that causes concern. In fairness, there has been one important pledge that’s allowed researchers to take a breath in the confusion: the UK has agreed to underwrite current awardees via UK Research & Innovation (UKRI) in the case of a ‘no deal’ Brexit. But whilst the funding will be honoured for commitments already made prior to the year 2020, moving beyond this it is still, despite the draft, quite unclear what impacts can be expected. There will be a new Horizon Europe funding programme in 2021 with €100bn available for research, but access to this will be a discussion point post-Brexit – so after March 2019.




Where to find new money? Horizon 2020 may still have funding lines available to the UK under a few conditions, but conversely there are funding lines for research programmes that we can expect to be cut off. The most notable cut will be from the European Research Council, which has given out €1.29bn to the UK. The UK has won 860 grants from the ERC to date. There is also the Marie Skłodowska-Curie funding, worth €0.7bn to UK researchers, and the SME Instrument grants, worth €140m to UK scientists. These three funding programmes are all strictly only open to European Union members. That’s a sizable chunk of funding that’s being taken away from UK researchers and innovators, representing around 45% of the UK’s receipts from Horizon 2020. So where can new money come from? The UK’s Conservative Government has been earnest yet vague in committing to science funding specifics post-Brexit, largely because it’s a fluid situation where anything could change before Brexit becomes reality. The official stance at the time of writing, from the UK Government, in their own words is: ‘Looking beyond 2020, the UK remains committed to ongoing collaboration in research and innovation and wants to work with the EU on a mutually beneficial outcome’. They also mention a proposal to form a cooperative accord with the EU on science and innovation and a commitment to ‘increase UK research and development spending to 2.4% of GDP by 2027.’ A figure of £7bn extra funding has been touted by the now resigned Science Minister, Sam Gyimah, to avoid a ‘cliff edge’ scenario for UK science with Brexit. There is also the opportunity for private sector funding, but with firms wanting Return on Investment in the relatively short term, some science will not marry with the aims associated with business growth. In truth, it is widely expected that shortfalls will be inevitable in the UK, as it will take considerable time to accumulate and leverage the kind of funding pots that have been available through Europe. What’s more, there is a justifiable fear in the UK science community that for major collaborative scientific projects in Europe, British scientists will no longer be in a position to lead.


Brain drains and border issues The challenge of Brexit is not just around the funding, it’s around stunting a cross-pollination of talent across borders. The latest information suggests that short trips between the UK and Europe will not need a visa, which is encouraging. For longer stays conditions will need to be met. There is a mention in the UK Government’s ‘official line’ about ‘openness to international talent,’ but here lies a problem. It’s been noticed that the number of EU researchers set to come to the UK is dropping. Whilst it is true many UK researchers feel they will be left out in the cold with European research, researchers from Europe coming into Britain also feel less than welcome due to the Brexit effect. It’s fair to say that some scientists from Europe do not have a rose-tinted view of Brexit and it does not send good signals for relocating to the UK and for collaborations. With a major motivation of Brexiteers being ‘to control borders’ and limit immigration, there has been an associated rise in racist attacks blamed on Brexit’s more publicly vocal, emboldened bigots. The UK can be perceived as less welcoming in contrast to pre-2016 years. For nurturing international teams of the best in the field, it’s not the perfect climate for working together, especially from a UK HQ. Equally, from the European view, consortiums with UK scientists will be scrutinised for viability. Where possible, scientists are applying for dual citizenship to counter restrictive immigration issues, but such acts seem desperate. There is a distinct unease for researchers from Europe based in the UK and for those who were thinking of coming over to the country for research purposes. New Scientist magazine recently carried out a survey of 4,300 people working in science and engineering based in the UK and Europe, and there was an overwhelming consensus that Brexit affected both the UK’s ability to attract European talent and also to retain existing staff in the UK. The magazine stated that 63 per cent of UK-based respondents who were hiring staff in the scientific community said Brexit would affect their recruitment activities during 2018-19. They concluded: ‘Four in 10 said it would make it harder to retain existing staff, and 30 per cent believed it would mean they would have to recruit more staff from within the UK.’

Is there any good news with Brexit? A recent interview published by Science magazine, with the UK’s former Science Minister, Sam Gyimah, confirms that whilst Gyimah voted Remain in the referendum, he was tasked with finding positives for scientists in the UK. Gyimah pointed out that a £1 billion ‘future leaders’ programme will be open to the brightest talent from around the world. The idea is that visa regimes will be easier for the best researchers to come to the UK. He also pointed out that science in the UK can be nimble and agile. Indeed, this was a potential bonus that was stated prior to the vote’s result to leave the European Union. The idea is that without 27 nations to check in with, on agreed strict regulations, scientific endeavours can accelerate processes that would otherwise be bogged down by excessive bureaucracy. An example is the EU’s clinical trials directive that some hailed as a disaster. By adding excessive levels of complexity in the form of rules and checks, laboratories found that clinical trials became unmanageable, resulting in non-commercial trials in the UK falling from over 600 between 2000-2003 to less than 300 in 2004-2007. It meant that only the big pharma companies could cope with the costs needed for trials, squeezing out academics and SMEs. It took the EU years to correct this obvious hindrance to progress and novel treatments. Post-Brexit, in theory, the UK could create unique regulations which will make it easier to be faster and more ‘agile’, as Gyimah put it. Whilst this could give the UK an edge in progress, breakthroughs may not be easily exportable, as the rules for use in other territories will still be present. The point is, to sell drugs in the EU you follow EU regulations. The most interesting and perhaps controversial proposal Gyimah has hinted at is that the Government in the UK is considering making a financial contribution larger than all the other associated members combined, in return for more focus on scientific excellence in programmes, as opposed to building capacity and capability in other EU countries. As part of that proposal, a caveat is that UK scientists would need to be involved in discussions and decisions. It’s controversial because, for one, many Leave voters were voting on the premise that the UK did not need to send money over to the EU anymore, whereas the suggestion is that the amount of money, for science at least, will be increased. However, the Department for Business, Energy and Industrial Strategy told EU Research: “It is right on Horizon Europe, that we do not associate at any cost.” Brexit is one of those uncommon deals, and one of those complex stories, where the uncertainties of implications dominate and it’s very much a fluid and volatile process. The process of a country untangling its roots from an institution this large, complex and integrated, is hard to imagine. Therefore, all promises and pledges feel delicate. There is also a significant public campaign to hold a second referendum on the vote in the UK that is not going away, even in the latest stages of the lead-up to Brexit. It feels like anything could happen and by the time this feature goes to print it may have already happened. Rising above the political mayhem, practical barriers and enormous blows in this era of Brexit, scientists need to find ways to work and collaborate for the benefit of grander moral obligations, discoveries and truths. The consensus amongst scientists seems clear enough: Brexit is a bad idea.
Whatever happens, the hope is that through the maze of difficulties, research can still function effectively across borders, reliant on the combined power of international minds and specialisms. The only truth we can rely on at present is that no one really has a grasp on how Brexit will pan out for all the stakeholders involved, and that’s because this kind of scenario is new to everyone involved in it.



Delving deep into the human imagination

The human imagination is enormously powerful, allowing us to conceive of far-fetched scenarios that go far beyond our individual experience. Yet imagination is useful for everyday practical reasoning as well. Researchers in the Logic of Conceivability project are using mathematical tools to investigate the logic of the human imagination, as Dr Peter Hawke explains.

The human imagination allows us to dream up fantastical scenarios far beyond the reality of our everyday experience, whether it’s a future living on Mars, pink elephants flying through the sky, or just an extravagant holiday in a glamorous location. While this use of the imagination is very flexible, we do not generally think of unbridled imaginings as a reliable guide to what is really possible. However, the imagination is also used in ordinary, practical reasoning, where it is more disciplined. “When we use the imagination in our ordinary reasoning, it’s actually very constrained and very disciplined,” says Dr Peter Hawke, a researcher in the LoC project. We seem to use our imagination when we draw conclusions about what’s really possible in certain situations, even for mundane tasks such as moving furniture out of a house. “One way to decide whether there is space to take a couch through a door is to imagine it. You don’t do it, you imagine it,” points out Dr Hawke. “After you’ve run that kind of imaginative exercise, you can often conclude that it’s really possible that you can take the couch through the door. You make your decision about what to do accordingly.”

LoC project This duality represents a challenge to researchers seeking to model the rational imagination, a topic that lies at the heart of the LoC project, an ERC-funded initiative based at the University of St. Andrews and the University of Amsterdam, under the leadership of Professor Franz Berto. In the project researchers are delving deeper into the underlying basis behind the human capacity to imagine or conceive of different scenarios, work which follows in the established tradition of using mathematical logic to study certain mental states. “The mental states we’re interested in are intentional representational states. The intentional part means that these states are thought of as having subject matter, so they’re about


something. The representational part means that they represent what they’re about as being a certain way,” explains Dr Hawke. For example, the content of such a state could be that John drives a red car, which is about John and represents him as driving a red car. “This could be the content of a belief, a piece of knowledge, or of an imagining. I could believe that John drives a red car, I could know that John drives a red car, I could imagine that John drives a red car,” continues Dr Hawke.

Basic laws govern such mental states. For example, if one believes that John drives a red car, then it must be that one also believes that John drives a car. While these connections are often simple, they form a basis on which Professor Berto and his colleagues can build a fuller picture of the rational imagination. “A mental state with a certain content and another mental state with slightly different content can stand in a logical relationship. This shows that there is a basic structure to thought,” he says. Such structure opens the door to the use of mathematical techniques in studying such mental states. The tools of mathematical logic are used to model three main things in relation to these intentional representational states. “We model the states themselves. We model the language that is used for talking and reasoning about those states. Finally, we model how those two things connect to each other,” continues Dr Hawke. A key part of the project’s work centres around modelling how the imagination operates in reasoning, in particular how it is constrained to be a useful guide on what is really possible. “You want to acknowledge that imagination can be quite free-flowing, but then you also want to have a model in which certain constraints can play a role,” says Dr Hawke. An important source of such constraints is the imaginer’s knowledge or beliefs about the actual world: when we try to imagine possible outcomes in order to decide what to do, our imaginings are highly sensitive to how we take the world to actually be. This suggests that there is an interesting interaction between disciplined uses of the imagination and other mental states, such as knowledge and belief. The project is working at an abstract level, with Dr Hawke and his colleagues aiming to develop a mathematical theory for modelling both the content of thought and the different mental states that have such content. “There have been lots of interesting proposals around using mathematics to model the content of thought in a useful way, and there’s been a lot of debate around the right way to model it,” he outlines. “We’re trying to model the sorts of constraints that can be added to the content of the imagination in a very abstract way, and the way in which use of the imagination can update one’s beliefs.” An individual using their imagination in reasoning starts with a certain body of beliefs, for example general beliefs about the manoeuvrability of couches. Once that individual has run an imaginative exercise, those beliefs may be updated in line with the results. For example, she might form the belief that it is possible that a particular couch can fit through a particular doorway. “That’s the kind of dynamic process that we’re interested in modelling,” explains Dr Hawke. Alongside this dynamic part of the imagination, in which imaginative exercises can lead to belief update, there is also a static part, and both are being modelled within the project. “We can model imagination as a snapshot of the agent’s state at a particular point in time,” continues Dr Hawke. “For example, I might ask you to engage in a certain imaginative exercise, like imagining a detective walking down the street in the ‘30s. I’m asking you to hold a certain attitude towards that content. I’m not asking you to believe it, I’m not asking you to assume it’s true – I’m asking you to imagine it. That’s static, in the sense that we have tools for modelling that content and modelling the sorts of attitudes one can take towards it at an instant in time.”
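To give a flavour of the kind of formalism involved, here is one schematic way such principles are often written down in modal logic. The notation is purely illustrative and is not the project's own formalism: $B\varphi$ stands for "the agent believes $\varphi$", $\Diamond\varphi$ for "$\varphi$ is possible", and $[\mathrm{im}\,\varphi]$ marks the situation after a successful imaginative exercise on $\varphi$.

\[
B\,\mathrm{RedCar}(j) \;\rightarrow\; B\,\mathrm{Car}(j)
\qquad \text{(belief respects the simple entailment from "John drives a red car" to "John drives a car")}
\]
\[
[\mathrm{im}\,\varphi]\, B\,\Diamond\varphi
\qquad \text{(after imagining } \varphi \text{ under one's constraints, one believes that } \varphi \text{ is really possible)}
\]

Part of the project's challenge is deciding exactly which principles of this kind should hold, and which fail once content and subject matter are taken seriously.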

Mathematical modelling The ultimate goal in research is to capture both these static and dynamic parts of the imagination within a model. The tension between the unconstrained and more disciplined uses of the imagination makes this task more difficult. “The whole point


about mathematical modelling is to capture structure. So, if it is the case that some uses of the imagination are very unconstrained, then that makes it very difficult to model them,” says Dr Hawke. Nevertheless, Dr Hawke and his colleagues have been exploring the idea that general mental content has some structure, opening up the possibility of modelling even the flights of fantasy. “It doesn’t matter which intentional representational state you’re talking about -- those states always have content, and content always has some structure that one can model mathematically,” he explains. “If all content has structure, then there is something to be said about the structure of imagination, even in really wild cases.” Researchers believe that constrained, disciplined uses of the imagination promise an even richer target for mathematical modelling, since there is a structured relationship between such imaginative exercises and other kinds of mental states, such as holding the belief that something is possible. “Individuals are able to do those sorts of imaginative exercises and form new beliefs in response. So, there’s some structure there that we can use,” he says. The project will lead to important contributions to the scientific literature and touches on several other disciplines, with implications beyond logic. “Logic has cross-overs with economics and psychology, cognitive science and philosophy. So, we’re going into different areas to inform our theories,” continues Dr Hawke. “One of the nice things about working with mathematics is that it’s a common language in the sciences. So, it makes it a lot easier to transfer your tools and results.”

LoC The Logic of Conceivability: Modelling Rational Imagination with Non-Normal Modal Logics

Project Objectives

We study the nature, structure and logic of intentional-representational mental states like knowledge, belief and imagination. Such states have content: they represent a certain subject matter as being a certain way. We aim at mathematical models of states and contents that are well informed by philosophy and cognitive science, and useful for artificial intelligence.

Project Funding

The LOC project is funded by the European Research Council, Consolidator grant no. 681404.

Contact Details

Project Coordinator, Francesco Berto University of St. Andrews, Scotland, in partnership with the Institute of Logic, Language and Computation at the University of Amsterdam, The Netherlands T: +31 064 111 9918 E: fb96@st-andrews.ac.uk W: http://projects.illc.uva.nl/conceivability/

Prof. Francesco Berto

Dr Peter Hawke

Professor Franz Berto is an Italian philosopher and logician specializing in ontology and nonclassical logics. He works at the University of St Andrews and at the University of Amsterdam and is the Principal Investigator of the Logic of Conceivability project. Dr Peter Hawke is a philosopher and logician. He earned his PhD from Stanford University in 2017, under Johan van Benthem and Krista Lawlor. He is currently stationed at the University of Amsterdam as a member of the Logic of Conceivability project, researching mental content, modality and the theory of knowledge. He is South African.



Remembering Apostolic Figures

Memories of Jesus of Nazareth, as well as reflections on his life and work, are central to the Christian faith, so references to him and apostolic figures abound in the literature. We spoke to Dr Stephan Witetschek about his work in tracing ways of remembering these apostolic figures to build a deeper understanding of the origins of Christianity.

The apostles are central figures in the history of Christianity, both as bearers of tradition and interlocutors of Jesus, and also simply as people whose lives are worth remembering. While the circle of the twelve closest followers of Jesus, later known as ‘the Twelve Apostles’, are the most prominent figures in this respect, there are individuals outside this group who also played a similar function in early Christian literature. “Paul is a prime example, but also Mary Magdalene and James, the brother of Jesus,” says Dr Stephan Witetschek. As the Principal Investigator of the Memoria Apostolorum project, Dr Witetschek aims to build a deeper characterisation of the apostolic figures, looking beyond the group of twelve. “The main part of the project will be studies of individual apostolic figures. This is partly about developing biographies of these individuals as historical figures, but also, and more importantly, as literary characters. There is also a small sub-project looking at the Twelve as a group,” he outlines. This research involves delving deep into the textual sources (with texts in languages like Greek, Coptic and Syriac) and analysing the evidence relating to events dating back to antiquity. Memory itself is imperfect though, and our recollections of an event can be affected by subsequent discussion and debate on it, so to an extent memory is thought to be created, a crucial insight in modern memory studies; this


The Memoria Apostolorum team.

is something Dr Witetschek takes into account in his research. “We apply these concepts of memory theory in the study of antiquity in looking at objects, and trying to adapt these concepts to the things that we study,” he explains. The way that events are remembered, particularly in literature, does not necessarily always correspond with other evidence of those same events, yet usually memory takes a shape that makes sense in a certain context. “At some point individuals and groups agree on a certain shared memory. That’s what one would speak of as a social memory, which is an important category in my project,” says Dr Witetschek. “The theoretical framework is a notion of memory as being socially constructed, based on Pierre Nora’s notion of Les Lieux de Mémoire.”

Memoria Apostolorum The central question in research is that of why early Christians remembered the apostles in the way that they did, focusing on the first three

centuries AD, with Dr Witetschek analysing different forms of Christian literature to gain deeper insights. One area of interest in the project is how theological ideas evolved in line with the perception of the people connected with them. “The suspicion is that different images of an apostle coincide with different theological conceptions. Let’s take one example – say Peter is remembered as a family person, somebody who had a wife, a daughter and a mother-in-law,” says Dr Witetschek. This could be taken as supporting certain theological ideas, yet in a different time and context, memories of Peter may have been re-interpreted. “Say this historical information is then read in a context that is not really favourable towards families, towards getting married and having children, but rather inclined towards asceticism. Then we read the strange story of Saint Peter’s beautiful yet lame daughter that shows how this memory has become problematic,” continues Dr Witetschek. The story is preserved only on a Coptic Papyrus (P.Berlin 8502, pp. 128-132). For another example, one could mention different memories of Paul: In his letters, he styles himself as mainly a man of the written word who is not very good at speaking in public (2 Cor 10.9-11). Yet in the Acts of the Apostles we are presented with an image of Paul as a competent public speaker, with no reference at all to letter-writing. The characterisations of the apostles connected with certain theological concepts



and ideas may have changed over time, in the service of certain shapes of Christianity or specific practices. A large part of the project’s work centres on gathering characterisations of the apostolic figures and trying to assess them, from which Dr Witetschek hopes to then learn more about prominent individuals. “Then we can trace parallels or differences, and also possible developments in the characterisation of a person,” he outlines. This relates primarily to the personal characteristics and traits of these different people, both good and bad, as described in the literature. “This is a tricky issue in New Testament studies – the question of whether a certain character is portrayed in a positive or negative way, whether they’re a good or not so good disciple. Then we can consider the meaning behind the inclusion of the negative traits of a person,” says Dr Witetschek.

For what reason, for what purpose, did early Christians remember the apostles in the way that they did?

A good example would be Peter’s denial of knowing Jesus, which is documented in the New Testament. While this does not put Peter in a positive light, it is now established as part of the historical picture of him. “It will be interesting to figure out how these different views, and the memory of their weaknesses, play into the overall picture,” says Dr Witetschek. Overall, a general tendency to depict apostles more positively is apparent over time, although there are also stories of conflict between different apostolic figures. “A few of the gospels record some conflict, some tensions between James and Jesus for example, but he is later remembered and recalled as the brother of Jesus, the leader of a Christian group in Jerusalem, and also as a bearer of revelation. It’s interesting to see how the apostles are rendered in very specific ways in different contexts,” outlines Dr Witetschek. This research is part of the wider picture of Christian history, and how certain acts of religious observance or practice evolved. New practices have been introduced over time, and Dr Witetschek believes it is important to understand their historic origins. “I take an observer’s perspective in research. This work is primarily about figuring out what was there in the first three centuries AD,” he says. The project will lead to several important contributions to the literature, including a compendium of apostles, which Dr Witetschek and his associates are preparing as a group. “It’s a concise presentation of the image of these figures that emerged in the first three centuries AD. So, this means looking at how they were presented in different texts, and exploring where there are connections between these different presentations of apostolic figures,” he outlines. “We’re also working on a couple of monographs and dissertations on individual characters.” This will help researchers develop a more general description of the memory of apostles in the first three centuries AD, which Dr Witetschek hopes will engage the attention of those with an interest in theological investigation. “There is a certain historical responsibility of theology, and of the church in general in terms of its self-understanding, to be aware of its own past,” he says. “Things like ministry weren’t there from the very beginning, but developed over time and were constructed by theologians for certain reasons. My research and my project also plays a role for the church in understanding the historical developments behind this.”

MEMORIA APOSTOLORUM Memoria Apostolorum. Apostolic Figures in Early Christian Memory (1st-3rd centuries)

Project Objectives

The project “Memoria Apostolorum” aims at understanding the references to apostles (or apostolic figures) in early Christian literature (1st-3rd centuries) as manifestations of Christian collective memory. Condensing and focussing this memory, they appear as “lieux de mémoire” and pivotal instances for the development of a collective identity of different Christian groups that is based on claims laid to figures of the past.

Project Funding

Funding by Deutsche Forschungsgemeinschaft: Heisenberg-Stipendium (WI 3620/4-1) and Heisenberg-Stelle (WI 3620/6-1) plus Sachbeihilfe (WI 3620/5-1).

Contact Details

Principal Investigator, Dr Stephan Witetschek
LMU Munich, Faculty of Catholic Theology
Professorship for Introduction to Biblical Studies
Geschwister-Scholl-Platz 1
80539 Munich
T: +49 89 2180 3585
E: stephan.witetschek@lmu.de
W: https://www.kaththeol.uni-muenchen.de/lehrstuehle/bibl_einleitung/memoria/index.html
W: http://gepris.dfg.de/gepris/projekt/388226599
W: https://www.kaththeol.uni-muenchen.de/lehrstuehle/bibl_einleitung/memoria/publikationen/index.html

Dr Stephan Witetschek

PD Dr. Stephan Witetschek is a Research Fellow in the Faculty of Catholic Theology at Ludwig-Maximilians University Munich. He has held research and teaching positions at institutions in Germany, the UK and Belgium, and is currently a Heisenberg Fellow of the German Research Foundation.



Getting to the heart of the Jewish community

The Jewish and Christian faiths share common foundations, but by the Middle Ages were distinct from each other. Nonetheless, Jews lived side by side with Christians in northern Europe and interacted with Christians and their ideas. Focusing on the period between 1100-1350, researchers in the Beyond the Elite project aim to build a clearer picture of daily life at the time, as Professor Elisheva Baumgarten explains.

Seal of the Jewish community of Augsburg. Impression in wax from the year 1298. / Photo: Arye Maimon-Institute for Jewish History, Trier University; photo of a seal cast, original impression in Stadtarchiv Augsburg.

The history of the Jewish people in Europe dates back to the early Middle Ages, but there is a marked increase in the volume of material describing the daily lives of Jews after the time of the first crusade, between 1095-1099. Researchers in the Beyond the Elite project are delving into this material to investigate how Jewish people in Northern France and Germany between 1100-1350 conducted their daily lives. “We’re exploring the fabric of the Jewish community,” says Professor Elisheva Baumgarten, the project’s Principal Investigator. The focus of attention here is ‘ordinary’ members of the community rather than the elite. “We know a relatively large amount about learned men, but what was the distance between them and a less learned person? Did they have the same lifestyle? Did they have the same daily routine? Did they believe in different things? Did they observe rituals differently? These are the types of questions that we are asking,” explains Professor Baumgarten. This period was also marked by a change in attitude towards the Jewish community. In the early part of the period between 1100-1350, Jewish people were invited into cities in Northern France and Germany, but later on there was a higher level of persecution. “Something happened during that period, something changed. The way that scholars have always explained that change is by looking at intellectual discourse


Seal of the Jew Jakob Daniel from Trier. Impression in wax from the year 1348. / Photo: Arye Maimon-Institute for Jewish History, Trier University; photo of a seal cast, original impression in Landeshauptarchiv Koblenz.

and exchange between Jews and Christians,” outlines Professor Baumgarten. While this was undoubtedly an important factor behind changing attitudes, Professor Baumgarten believes it doesn’t tell the whole story and that more attention should be paid to daily life. “The story has to include more people and more everyday events,” she stresses. “We haven’t previously looked in-depth at relations in daily life between Jews and Christians. These issues around how a group goes from being a separate but integrated part of society to being persecuted are also very relevant to today’s world.”

Seal of the Jewish community of Regensburg. Impression in wax from the year 1356. / Photo: Arye Maimon-Institute for Jewish History, Trier University; photo of a seal cast, original impression in Bayerisches Hauptstaatsarchiv Munich.

Everyday life A range of sources are being considered within the project, as researchers seek to get to the heart of what everyday life was like for Jewish people during this period. Rather than focusing on a single type of source, Professor Baumgarten and her team aim to look at as many different sources and genres as possible. “In order to get a fuller picture of daily life, we need to consider as many different factors as we can,” she stresses. Texts and records are available, mainly written in Hebrew, but also in Latin and the vernacular, as well as many other artefacts. The newest member of the team studies the wax seals used on official documents, for example, which can lead to new insights. “Some of the seals during the high Middle Ages were very elaborate, very decorated, and they tell us about people’s values and how they saw themselves,” outlines Professor Baumgarten. “Other students work on architecture, Hebrew legal literature, as well as liturgy, and interpretations of liturgy.” This provides the basis for researchers to reconstruct people’s daily experiences, including where they worked, how they dressed, and how they expressed their identity as Jews in what were predominantly Christian surroundings. Jewish communities at this time were fairly small, numbering around a few dozen families at the most. “They would live next to Christians, within the Jewish quarter, that was home to both Jews and Christians,” outlines Professor Baumgarten. These communities came under



varying degrees of pressure to convert, and Professor Baumgarten says that some people indeed chose to do so, for various reasons. “Maybe they thought their life would be better as Christians, they thought new opportunities would open up to them in the future,” she continues. “Some scholars have suggested that young men in particular were the most likely to convert, the teenagers and young men, who saw it as an opportunity for a better future.” Researchers are also looking at people’s willingness and desire to display their Jewish identity, as well as whether this was related to the level of pressure they were under to convert. Over this period certain forms of dress came to be associated with Jewish people, although they had previously been worn by people of other faiths. “Scholars of art history have shown that during the early part of this period very respectable Christian men wore a particular hat, perhaps even bishops and members of the clergy. But during the course of the 13th century, this hat came to be more and more closely associated with Jews, and by the 14th century it had almost become a symbol of the Jewish community,” says Professor Baumgarten. Jewish people who chose to wear this and other garments associated with their faith were openly displaying their religious identity, while others were more circumspect, which Professor Baumgarten says raises some interesting questions. “Would people rather

Lovers holding hands. Silver plate from the Weißenfels treasure. 1st quarter of 14th century. Moritzburg Gallery in Halle. Registration number: Mo_LMK-E-164. (Photo by Dr. Ido Noy).

MS London, British Library Add. 11639, Fol. 226v.

not stand out? Or would they rather advertise the fact that they’re Jewish?” she asks. An individual’s dress was important at the time as a way of indicating status, wealth or standing in society, and different organisations like convents and monasteries also had their own distinguishing dress, such as certain belts or habits. Some scholars have argued that this willingness of Jewish people to display their cultural identity through dress was related to the relative abundance of resources at the time. “This was a time of plenty, and Jews could afford to manifest who they were and what they believed in,” outlines Professor Baumgarten. This relative plenty was accompanied by a degree of intolerance, yet there was still space for cultural exchange. “Scholars have described this period as both a time of change and as a time of growing intolerance. Still, we have records of Christian scholars sitting with Jewish scholars and studying the Bible for example. And we certainly see cultural exchange, on the level of the stories that we see, written in Latin or the vernacular,” says Professor Baumgarten. The Jewish and Christian faiths share many common foundations, including certain biblical teachings, yet by the Middle Ages they were distinct from each other. Much was still shared between them however, and certain elements of Christian practice were adopted by Jews over time, a topic of great interest to Professor Baumgarten. “It’s fascinating to

Ritual bath in Friedberg (Photo by Dr. Ido Noy).



BEYOND THE ELITE Beyond the Elite: Jewish Daily Life in Medieval Europe

Project Objectives

Beyond the Elite is a multifaceted research project that seeks to explore what daily life was like for the Jews of northern France and Germany (Ashkenaz) from 1100 to 1350. Beyond the Elite seeks to understand and describe the daily rituals, the objects and the spaces that made up the lives of the members of these communities living within Christian society.

Project Funding

This project has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme under grant agreement No 681507

Contact Details

Principal Investigator, Professor Elisheva Baumgarten Rabin Building Room 4007 The Hebrew University Mount Scopus Campus Jerusalem T: +972 2 588 0424 E: BeyondtheElite@mail.huji.ac.il W: https://beyond-the-elite.huji.ac.il/book/about

Professor Elisheva Baumgarten

Photo by Hila Shiloni

Elisheva Baumgarten is the Yitzchak Becker Professor of Jewish Studies and teaches in the Departments of Jewish History and History at the Hebrew University of Jerusalem. She is a social historian who studies the Jews of medieval northern Europe. She has published on family life, rituals and everyday religious observances of medieval Jews.

see how these developments took place. We aim to try and understand what elements Jews chose to adopt while this evolution was going on, what they borrowed, adapted, or acculturated from their surroundings,” she says. In many cases, the Jewish people would take elements that already existed in their culture, and meld them with the common practices that they saw around them. “Some practices existed in late antiquity, but had new manifestations in medieval culture. We’ve noticed that a lot of these changes took place over the period between 1100 and 1350.” A good example is the kaddish, a hymn of praises to God, which became part of the prayer for the dead and how they were commemorated over time. The prayer itself was not new, but saying it on a daily basis for the dead became a part of common practice over the course of the 12th and 13th centuries. “Somebody whose mother had recently died would be in the synagogue every day for a year after the death reciting this prayer,” explains Professor Baumgarten. A further dimension of the project’s work involves comparing Jewish communities of northern Europe – mainly in Germany – with those in the Mediterranean. “We are looking at how identity was formed differently in various locations, because of different cultural, political and social constraints,” continues Professor Baumgarten. “The way that Jews interacted with their neighbours is important in each of these locations, while we’re also looking at the internal structures of the Jewish community.” The project’s findings to this point suggest that Jewish communities manifested and reformulated their ideals as part of their daily activities, and that the choices made by their leaders and individuals were related to the nature of the majority Christian culture in which they lived. The team members are probing the mechanisms behind these processes.

MS London, British Library Add. 11639, fol. 542v, Christian calendar in circles.

By looking beyond the elite and problematizing everyday life, Professor Baumgarten hopes that she and her team have opened up a new perspective on Jewish communities. “We hope that we’ve really helped to change the way we think about Jewish communities, and opened up new questions for scholars who are interested in this topic,” she says. The project is coming towards the end of its second year, and the emphasis on collaboration and inter-disciplinary knowledge sharing has led to new insights, which Professor Baumgarten intends to build on further in the future. “This project will not be finished once the funding period is over in terms of doing what we set out to do,” she says. “Many new questions and ideas will open up from it. So I really see it as a passageway into a new area of research, opening up new questions and ideas.”

ERC team on trip to Germany, May 2018 (Photo by Dr. Ido Noy).



© Götz Walter, Biermann-Jung Kommunikation & Film.

Studying the fundamentals of infinite symmetry

Groups are of fundamental importance throughout mathematics, and new analytical methods are being developed to gain deeper insights into unresolved questions in the field. Professor Andreas Thom tells us about the work of the GrDyAp project in connecting group theory, functional analysis, and the theory of dynamical systems, which could open up new perspectives on long-standing mathematical problems.

The study of symmetry is at the heart of pure mathematics, with researchers building on the existing foundations of group theory to approach long-standing problems in the field. Groups arise as symmetries of objects and are thus of fundamental importance across different branches of mathematics; now researchers in the ERC-funded GrDyAp project are looking deeper into the subject. “The project tries to connect group theory, functional analysis, and the theory of dynamical systems,” says Andreas Thom, Professor of Mathematics at Technische Universität Dresden, the project’s Principal Investigator. This work also touches on several other areas within the pure mathematics field. “My research is solely in pure mathematics, but it brings together different branches of pure mathematics,” continues Professor Thom. A group itself can be understood as a type of algebraic structure consisting of symmetries which can act on an object, like a geometric figure or something more complicated. For example, a cube has 24 symmetries which act on the cube; this may on the surface seem difficult to fully understand, but things get clearer when one looks at the cube in more detail. “Each symmetry of the cube has a concrete meaning. An abstract group cannot be understood directly, precisely because a natural object is missing, on which it acts by symmetries (whose symmetries it is). Now, in geometric group theory, such an object is constructed: the Cayley graph,” explains Professor Thom. Symmetries can explain complex behaviour of phenomena, ranging from material sciences to quantum physics to everyday objects, says Professor Thom. “Why is a dice fair? Because it has so many symmetries that make it obvious that each side has equal probability,” he points out.
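For readers who want to see the object being described, the standard definition (textbook material rather than anything specific to the project) is short. Given a group $G$ and a generating set $S$, the Cayley graph $\mathrm{Cay}(G,S)$ has the group elements as vertices and one edge for each application of a generator:

\[
V = G, \qquad E = \bigl\{\, \{g,\ gs\} \;:\; g \in G,\ s \in S \,\bigr\}.
\]

The group then acts on this graph by symmetries, since left multiplication by $h$ sends the edge $\{g, gs\}$ to the edge $\{hg, (hg)s\}$; this supplies exactly the kind of natural object referred to above. For the cube, the 24 rotations mentioned earlier are generated, for instance, by two quarter-turns about perpendicular face axes, so a small set $S$ already captures the whole group.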

Infinite symmetry groups The major goal in the project now is to study infinite symmetry groups, with Professor Thom and his colleagues looking to develop

novel methods to approach some of the major challenges in infinite group theory. Groups can be found throughout mathematics, and while significant progress has been made in classifying some types of groups, notably finite symmetry groups, Professor Thom says that infinite symmetry groups are particularly challenging. “Infinite groups are harder to study because one cannot write down a complete multiplication table. One has to understand them by other means, for example via the geometric object whose symmetries they describe,” he outlines. “One such object is the Cayley graph, but frequently there are also even more geometric objects, such as

© Götz Walter, Biermann-Jung Kommunikation & Film.



© Götz Walter, Biermann-Jung Kommunikation & Film.

manifolds, or more analytic objects, such as Banach spaces.” A lot of attention in research is now focused on developing new methods to study these infinite symmetry groups, work which brings together elements of different branches of mathematics. For his part, Professor Thom is applying randomization and algebraic approximation techniques in this area. “I have been using randomization techniques to construct laws of groups, special equations that all elements of the group satisfy. In a sense, these are generalizations of the identity x+y=y+x for the integers, but they are much more complicated in form and valid in much more complicated groups,” he outlines. “Algebraic approximation on the other hand is a special interplay between ring theory and functional analysis that I have used in joint work with Guillermo Cortinas (Buenos Aires) to prove Rosenberg’s Conjecture.” This particular conjecture relates to a special question about algebraic K-theory of rings of continuous functions, while Professor Thom and his colleagues are also investigating other interesting problems in the field. One is Dixmier’s Conjecture, put forward by the French mathematician Jacques Dixmier,


which centres on the representations of a group on Hilbert space, the most symmetrical object imaginable. “Dixmier’s conjecture is now almost 70 years old and has inspired a lot of research over time. Together with Maria Gerasimova, a PhD student in my ERC group, we have made some progress relating this problem with properties of the Cayley graph of a group, in fact all the Cayley graphs of the group,” says Professor Thom.
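As a point of reference for the 'laws of groups' mentioned a little earlier (a standard formulation, not quoted from the papers themselves): a law for a group $G$ is a non-trivial word $w$ in free variables that collapses to the identity under every substitution of elements of $G$. Commutativity is the simplest example; written multiplicatively,

\[
w(x,y) = x^{-1}y^{-1}xy, \qquad w(a,b) = e \ \ \text{for all } a, b \in G,
\]

holds precisely when $G$ is abelian, which for the integers is just the familiar $a + b = b + a$. The laws constructed with randomization techniques play the same role for far more complicated groups, but the words involved are much longer.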

My research is solely in pure mathematics, but it brings together different branches of pure mathematics, such as group theory, the theory of dynamical systems, algebraic topology, and functional analysis.

The Dixmier conjecture itself is about the understanding of amenable groups, which define an important class of discrete groups, in a sense close to familiar groups like the group of integers or finite groups. Given that discrete groups are more combinatorial in nature, it might be expected that mathematicians would not be keen to use analytical methods like derivatives and real valued functions in this area, yet this is in fact not the case. “In modern mathematics analytical methods are also used to study discrete objects. In fact they are very useful,” outlines Professor Thom. Amenability is an analytic property of the action of a group on a Cayley graph, first described by John von Neumann in the late ‘20s. “Dixmier’s conjecture is about the understanding of amenable groups. Dixmier asked whether all unitarisable groups are amenable,” continues Professor Thom. The converse had already been proved to be true, namely that amenable groups can be unitarised. It follows that a positive answer to Dixmier’s question would imply that a group is amenable exactly when it is unitarisable, hence unitarisability would be a characterization of amenability, a topic of great interest to Professor Thom. “This would indeed be very striking and I would really like to know an answer to this question,” he says.
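Stated slightly more formally, for orientation (this is the standard phrasing of the problem rather than a quotation from the project): call $G$ unitarisable if every uniformly bounded representation of $G$ on a Hilbert space is similar to a unitary representation. The classical result from around 1950 and Dixmier's question then read

\[
G \ \text{amenable} \;\Longrightarrow\; G \ \text{unitarisable},
\qquad\qquad
G \ \text{unitarisable} \;\overset{?}{\Longrightarrow}\; G \ \text{amenable}.
\]

A positive answer would make unitarisability a purely representation-theoretic characterisation of amenability.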



Alongside group theory, Professor Thom is also investigating the possibility of using other methods to address these types of problems, including approximation theorems. “My focus recently has been more on approximation and the stability of groups,” he outlines. Permutations are symmetries of finite sets. The Gromov conjecture, which states that all groups are approximable by permutations, is a problem of particular interest to Professor Thom. “Stability says that any approximation can be made a true equality. It is very surprising that stable groups exist, and together with my PhD student Marcus de Chiffre - who is also in the ERC group - we have found the first non-trivial examples of stable groups,” he says.
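One common way of making 'approximable by permutations' and 'stability' precise (a generic formulation, included for orientation and not necessarily the exact definitions used in the work cited) is via the normalised Hamming distance on the symmetric group,

\[
d_n(\sigma,\tau) \;=\; \tfrac{1}{n}\,\#\{\, i \le n : \sigma(i) \neq \tau(i) \,\}.
\]

A sequence of maps $\phi_k : G \to \mathrm{Sym}(n_k)$ is an almost-homomorphism if $d_{n_k}\bigl(\phi_k(gh),\, \phi_k(g)\phi_k(h)\bigr) \to 0$ for all $g, h \in G$; the approximation question asks whether every group admits such a sequence that also separates its elements, and stability asks whether every almost-homomorphism is close, in the same metric, to a sequence of genuine homomorphisms.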

Mathematical concepts The wider goal of mathematics in general, according to Bill Thurston, is to reduce any confusion which still remains around core mathematical concepts and to build a deeper understanding of the interplay between basic notions. While this research is largely fundamental in nature, the hope is that the insights gained will prove useful in other mathematical contexts. “We are always looking for generalizations,” stresses Professor Thom. These novel methods could in future be

applied to other mathematical problems, for example the Bergeron-Venkatesh conjecture. “It might be that at some point in future I will realize that a particular method I previously developed is also useful to understand this conjecture. This has happened with other problems in the past, and I am aware of a lot of maths problems in neighboring areas,” says Professor Thom. Further investigation in these areas would require deep understanding and sophisticated methods, underlining the importance of interdisciplinary research. Professor Thom believes it is important to share expertise and learn about the methods that are being applied in other fields. “It is always fun to work in an interdisciplinary way and to apply methods from one field in another,” he stresses. This could then open up new avenues of research and spark further investigation; one longer-term goal for Professor Thom is to use these novel methods to address the growth of torsion in sequences of lattices in hyperbolic groups, while he says there are also many other unresolved questions. “Group theory and functional analysis are fascinating topics, there are still so many interesting open problems,” he says.

GRDYAP Groups, Dynamics, and Approximation

Project Objectives

The study of infinite symmetry groups is a particularly challenging part of group theory, as most of the tools from the sophisticated theory of finite groups break down, so new global methods of study have to be found. The interaction of group theory and the study of group rings with methods from ring theory, probability, Riemannian geometry, functional analysis, and the theory of dynamical systems has been extremely fruitful in a variety of situations. In the GrDyAp project, Professor Thom and his colleagues aim to extend this line of approach and introduce novel approaches to long-standing and fundamental problems.

Project Funding

ERC - Consolidator Grant

Contact Details

Project Coordinator, Professor Andreas Thom Institute of Geometry TU Dresden 01069 Dresden T: +49 351 463-43074 E: andreas.thom@tu-dresden.de W: https://tu-dresden.de/mn/math/geometrie/thom/forschung/erc-projekte

Professor Andreas Thom

Andreas Thom is Professor of Mathematics at Technische Universität Dresden, a position he has held since October 2016. He is a member of several mathematical societies and of the Max Planck Institute for Mathematics in the Sciences, and is also a member of the editorial board of ‘Mathematische Annalen’.

Marcus de Chiffre

Maria Gerasimova

Tiling of the hyperbolic plane (Image source Wikimedia Commons).



Charting the life course of migrants

Debate continues over whether citizenship should be viewed as an incentive for migrants to integrate in local communities, or as a reward for doing so. The MiLifeStatus project is probing deeper into the relationship between migrant naturalisation and integration, research which holds clear importance to citizenship policy, as Professor Maarten Vink explains.

The post-war period was marked by large flows of migrants into Western Europe. While this was initially thought to be a temporary phenomenon, over time it became apparent that many migrants didn’t intend to return to their country of origin and instead planned to stay and put down roots in their new home. “Not only did they want to stay, they also wanted to bring their spouse and children, who would grow up here,” explains Professor Maarten Vink, the Chair of Political Sociology at Maastricht University. The question of how to integrate migrant communities into national life has since become increasingly important. “There are cases where the descendants of long-standing migrant communities grow up in countries where they don’t hold citizenship. So they don’t have any political rights, and they haven’t got certainty as to their legal status,” continues Professor Vink. “This is problematic from both a community perspective and the perspective of the individual.”
MiLifeStatus This topic is central to the work of the MiLifeStatus project, an initiative looking into the relationship between migrant naturalisation and integration, something which remains high on the political agenda in the wake of the European refugee crisis. There’s an ongoing political debate in the Netherlands, and indeed many other countries in Western Europe, about whether citizenship should be seen as an incentive


for migrants to integrate, or as a reward for doing so. “Some people on the political left argue that citizenship should be seen as an incentive. So it is said that we need to make citizenship accessible for migrant groups, and once they receive citizenship status they will be able to consolidate their lives,” outlines Professor Vink. People on the political right however tend to see citizenship more as a reward for integration. “The most well-integrated immigrants will be rewarded with citizenship,” says Professor Vink. The underlying assumptions behind these ideological positions have not been properly tested; now Professor Vink and his colleagues aim to build a stronger evidence base in this respect. The project’s research is focused on four countries – Sweden, Denmark, Germany and the Netherlands – each with different rules around migrant naturalisation, starting with residency. “Here we see great variations – there is a five year residency requirement in the Netherlands and Sweden, eight in Germany, and nine in Denmark,” outlines Professor Vink. There is also usually a good behaviour component to acquiring citizenship, yet some countries choose not to put up many other barriers. “The most liberal country in the project is Sweden, where there is essentially no language and integration requirement. So if you have lived for five years in Sweden, are not a security threat and don’t have a criminal record, then you can become Swedish,” continues Professor Vink. There are more stringent integration

requirements in other countries however. Migrants applying for a Danish passport must reach quite a high level of language proficiency for example, while Professor Vink says many countries have citizenship tests, with questions on issues that are seen to be important in terms of integration into national life. “In the Netherlands the questions tend to be more attitudinal, while in Germany there are questions about the constitution and national history,” he outlines. These different requirements have a long-term impact. “It’s really quite remarkable”, Professor Vink says, “as after fifteen years, three-quarters of migrants in Sweden have naturalised, but in the Netherlands only around half and in Denmark and Germany not more than a third.” Researchers are now investigating the extent to which making citizenship more or less accessible gives it a different meaning, looking at data available from each of these four countries. “We use big data from population registers, and we also look at the integration of migrants on the labour market. Most politicians would agree that they expect migrants to be self-sufficient, to contribute to the national economy,” continues Professor Vink. A migrant’s citizenship status is an important factor here, as employers may be wary of offering a job to somebody without permanent residency rights or good language skills. It might therefore seem logical that naturalised migrants are more likely to be in

MILIFESTATUS Migrant Life Course and Legal Status Transition

Project Objectives

The MiLifeStatus research team will disentangle the relationship between migrant naturalisation and integration in a longitudinal and comparative manner. The central – and innovative – idea of the research is to model migrants' legal status transitions as life course events, which are in turn shaped by their origin, their family context, and societal structures and institutions.

Naturalisation ceremony, Maastricht, The Netherlands.

"We see that those migrants who have acquired citizenship generally do better on the labour market," explains Professor Vink. This is to some extent a chicken-and-egg type question, so Professor Vink and his colleagues are looking into this in more depth to tease out the causal effects. "Does acquiring citizenship lead to better integration? Or are these better-integrated immigrants more likely to naturalise anyway?" he continues. "We use longitudinal data, where we track migrants over time. We can also compare their labour market status before and after naturalisation."
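To make this kind of before/after comparison concrete, the short sketch below illustrates the general idea in Python. It is purely illustrative and is not drawn from the project's own analysis pipeline: the column names (person_id, year, naturalised_year, employed) and the toy values are invented for the example, and a real register-based study would control for many more factors.

# A minimal, illustrative sketch (not the project's own code) of comparing
# individuals' employment in the years before and after naturalisation.
# All column names and values below are invented for illustration.
import pandas as pd

# Hypothetical longitudinal register extract: one row per person per year.
df = pd.DataFrame({
    "person_id":        [1, 1, 1, 1, 2, 2, 2, 2],
    "year":             [2005, 2006, 2007, 2008, 2005, 2006, 2007, 2008],
    "naturalised_year": [2007, 2007, 2007, 2007, None, None, None, None],
    "employed":         [0, 0, 1, 1, 0, 1, 1, 0],
})

# Years relative to naturalisation (NaN for people who never naturalise).
df["rel_year"] = df["year"] - df["naturalised_year"]

# Compare employment rates in the years before and after citizenship.
df["period"] = pd.cut(df["rel_year"], bins=[-100, -1, 100], labels=["before", "after"])
print(df.dropna(subset=["period"]).groupby("period", observed=True)["employed"].mean())

The snippet only shows the basic logic of tracking the same individuals across the naturalisation event; separating the effect of citizenship from selection into it requires the richer panel methods the researchers describe.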

This longitudinal approach allows researchers to identify whether the well-integrated migrants who are doing well on the labour market are also those who are more likely to naturalise. To some extent the process of naturalisation can incentivise migrants to acquire skills that might also benefit them on the labour market. "We found that naturalisation encourages migrants to do well on the labour market," outlines Professor Vink. This effect is heightened when a migrant acquires citizenship relatively soon after they have completed the residency requirement, rather than later on, when their career trajectory may be more defined. "The effect of naturalisation to some extent precedes the acquisition of citizenship, because migrants have invested in their own skills by acquiring language capability and learning about the country they're living in," says Professor Vink.

Citizenship and integration

The wider backdrop to this research is the ongoing debate about the relationship between citizenship and integration. Researchers aim to contribute to this debate and help assess the likely impact of policy ideas, such as a proposal by a previous government in the Netherlands to extend the residency requirement for naturalisation from five years to seven. "The idea was that migrants would be better integrated after seven years," explains Professor Vink. This proposal has since been shelved, partly drawing on the research of Professor Vink and his colleagues. "We were able to contribute to the debate by showing that this proposal was likely to have counter-productive effects, as making migrants wait longer for naturalisation might actually disincentivise them, and diminish the potential for effective integration," he outlines. "We want to contribute to ongoing political debates, and to help politicians make more reasoned decisions."

This work is ongoing, with the project currently in the third year of a five-year funding term. Researchers are currently analysing the data and drawing comparisons, looking to gain deeper insights about the situation in Western Europe. This could form the basis for wider comparative research in future, for example looking at data on citizenship from North America. Professor Vink intends to bring this work to wider attention. "I have even taken up tweeting. We want to publish our results in high-ranking peer-reviewed journals, but we want to make sure that we also share those results with the broader public," he says.

Project Funding

MiLifeStatus has received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant agreement No 682626).

Project Partners

MiLifeStatus collaborates with researchers from Malmö University, Lund University and the Danish Human Rights Institute (see: https://www.milifestatus.com/team).

Contact Details

Project Coordinator
Professor Maarten Vink
Professor of Political Sociology
Department of Political Science, Faculty of Arts and Social Sciences
Maastricht University
T: +31 (0)43 388 3376
E: m.vink@maastrichtuniversity.nl
W: https://www.milifestatus.com/

Professor Maarten Vink

Maarten Vink is Professor of Political Sociology at Maastricht University, where he is Co-Director of the Maastricht Center for Citizenship, Migration and Development (MACIMIDE). Vink leads the 5-year research project 'Migrant Life Course and Legal Status Transition (MiLifeStatus)', funded by a Consolidator Grant of the European Research Council (2016-2021). He is Co-Director of the Global Citizenship Observatory (GLOBALCIT).



A glimpse into the past of domestic service in India

Domestic service was an important category of labour and social relationships in early modern and colonial India, yet it has been relatively neglected in research. We spoke to Dr Nitin Sinha, Dr Nitin Varma, Sourav Kumar Mahanta and Vidhya Raveendranathan about the work of the DOS project in investigating the master-servant relationship historically, and clarifying its importance for wider social and labour histories.

Behind the curtain of domestic service in India

Lady Impey supervising her household, Calcutta, 1777-83.

Domestic service is a ubiquitous aspect of India's historical past, yet this theme has attracted very little research attention. Researchers in the DOS project are taking a fresh look at this topic, focusing on the period between the 18th and 20th centuries, approximately the time of British colonial rule in India. "We wanted to start the project from this time to try and build on the work that had been done on earlier and later periods," says Dr Nitin Sinha, the project's Principal Investigator. While the project is focusing on this particular period, Dr Sinha says it is also important to be aware of the preceding events and trends that led up to it. "In the project we have been in contact with scholars who worked on the early modern period, from the 16th to the 18th century, and we have solicited some contributions and papers from that period, which will appear in our published edited volumes and special journal issues in the near future," he outlines.



There are examples of Indian households keeping domestic servants in the 15th and 16th centuries, many of whom served Hindu merchants, Mughal nobility and a variety of other households, both privileged and of lesser means. What was new in the colonial context was that the occupational fluidity and mobility that servants had enjoyed in the pre-colonial period gradually petered out, to be replaced by a more formalised relationship. From the mid-18th century, a series of regulations were passed which defined the relationship between master and servant in contractual terms. This is evident in work on the Bengal presidency done by Dr Sinha and his colleague Dr Nitin Varma, and also in research on Madras by Vidhya Raveendranathan, a PhD student who is also working on the project. This series of regulations – some of which failed, some of which were successful, and some of which had long-lasting unintended effects on the nature of domesticity, property and criminality – comprised the essence of master-servant laws as they operated in the colony.

Researchers are now building on these foundations to look in greater depth at domestic service in India, which is central to wider social, cultural and labour history. Labour history in India has largely centred on the development of industry and productive labour, while domestic service has been seen as comparatively incidental and marginal, an imbalance that Dr Sinha and his colleagues aim to redress. "Our project is trying to address that gap," he says. Researchers are working with a range of textual and visual material, including state departmental records, court records, censuses, surveys, private papers, diaries, memoirs, advice manuals, vernacular literature, paintings and photographs, through which they aim to build a deeper picture of domestic service at this time. "For instance, we have looked at court cases and proceedings, which is challenging as their structure can be highly routinised; servants' testimonies as witnesses can become formulaic. They are useful but they demand cautious treatment as they do not readily represent the 'authentic' voice of the subordinate," outlines Dr Sinha.

Master-servant relationship

The relational nature of the term 'servant' is itself a topic of great interest in the project: while it implies subservience to the master or mistress of the household, servitude itself could take different forms. Domestic service, servitude and forms of domestic slavery overlapped, and with female servants and slaves the question of reproductive labour and kin-making became crucial. While fully aware of the slave-servant overlap, the project argues that the expansion in domestic service was related to the firmer contractual basis of the master-servant relationship. This, in turn, was premised upon – and reflected the expansion in – the domestic labour market itself. The project addresses some crucial questions. "How did masters and mistresses command servants beyond the institutional framework of law? What role did the use of language, gestures and speech play in the making of this relationship? Did objects such as liveries and badges matter? How could commodities such as wine and silver cutlery, which were often allegedly stolen by servants, help us in knowing the everyday relation between masters and servants? What forms of 'complicity' can one detect between private control exercised by masters and the state's formal institutions governing domestic labour, crime and the market? What were the forms of overt and everyday resistance by the servants?" asks Dr Sinha. "We can look more deeply into this when we aggregate different sources, and assess how it changes over time."

There may have been a wide variety of roles within a household, perhaps in the stables or the kitchen, depending on the size of the household and the outlook of the master or mistress. Evidence suggests that during the early part of the 19th century, Europeans in India had established a hierarchy among servants, in part because of the size of the domestic staff. "It was thought that a European gentleman's household should have around 28-30 servants – out of which 7 or 8 would be seen as upper-grade servants, and the rest as lower-grade servants," says Dr Varma. With a staff of this size, it was necessary to delegate responsibility to one person. For all the bearers in the household, for instance, there would be a 'head bearer'. "We can see that one person was responsible and was put in charge of a task," says Dr Sinha. "The working of colonialism required the simplification of relationships." This extended to recruitment: while the master may have selected one or two servants, those people were then themselves responsible for hiring the rest of the staff, and for the performance of those servants.

Violence was often used to exert discipline. "Punishment was not just verbal, there are instances of brutal violence, almost at the everyday scale," says Dr Sinha. A master would not necessarily send an erring servant to court or ask the authorities to investigate, but rather order other members of staff to lash them.

A group of Madras servants, 1870.



DOS Domestic Servants in Colonial South Asia

Project Objectives

A study of domestic service and servants opens up the intimate history of social relationships and explains the long, 'legally constituted but formally absent' history of the regulation of domestic work, which is a matter of global concern in present times. The master-servant relationship lay at the heart of the making of social identities, hierarchies, household relationships and the processes of state formation.

Project Funding

ERC-funded three-year project (2015-18).

Project Partners

• Leibniz-Zentrum Moderner Orient, Berlin
• IGK, re:work, Humboldt University, Berlin

Contact Details

Project Coordinator
Dr Nitin Sinha
Leibniz-Zentrum Moderner Orient (Centre for Modern Oriental Studies)
Berlin
T: +49-(0)30-80307-113
E: nitin.sinha@zmo.de
W: https://servantspasts.wordpress.com/

Dr Nitin Sinha

Dr Nitin Sinha is a Senior Research Fellow at Leibniz-Zentrum Moderner Orient (Centre for Modern Oriental Studies) in Berlin. He gained his PhD from the School of Oriental and African Studies (SOAS) in London in 2007 and has since held teaching and research positions at institutions in both Germany and the UK.

Dr Nitin Varma

Dr Nitin Varma is a Senior Research Fellow at IGK, re:work, Humboldt University in Berlin. He studied history at the University of Delhi and JNU (Delhi) and later received his doctorate degree from Humboldt University in 2011.

Project Members

Heena Ansari (January-June 2017)
Sourav Mahanta (October 2017-July 2018)
Vidhya Raveendranathan (November 2017-July 2018)

Servants' Pasts, 2015-2018

"This was very common. A lot of the violence wouldn't even be reported in the courts – it was only if the person was seriously injured that it would become a judicial matter," outlines Dr Varma. "Only the more brutal acts of violence are documented in court records. We work primarily through documents that are available in the archive, so we tend to see the more extreme examples."

The nature of the master-servant relationship began to change over the latter part of the period, as cases of violence towards servants attracted increasing attention, while social expectations and behaviour also shifted in a rapidly changing political economy. The distinction between upper- and lower-grade servants started disappearing around the late 19th century with the consolidation of the identity of domestic servants, a time when Dr Varma says a new dynamic emerged. "The upper-grade servants were now seen more as professional service providers, while the lower-grade servants became the archetypal domestic servant. This was a distinct period of consolidation of ideas and practices attached to meniality, stigma, purity and work in the figure of the domestic servant. This marked the emergence of the category of servants as social marginals," he says. Class distinctions, and caste identity, are important considerations in this respect. "By the late 19th century, meniality became a function of caste to a greater degree and the work of some servants associated with ritually impure and dirty work (sweepers, for instance) became further stigmatised and devalued," explains Dr Varma. In turn, stigma heavily defined domestic work itself. The devaluation of paid domestic work was also occasioned by other processes of feminisation and casualisation, to the point where the part-time female domestic servant was the most representative example of domestic service by the end of the 20th century.

There are also other important sources of insight into the lives and experiences of servants during this period, including census records. As a research assistant on the project, Sourav Kumar Mahanta spent some time examining census records, and found files specifically on the classification system for different occupations. "These files give us details about internal debates and differences of opinion between the Census Commissioner and provincial census superintendents," he outlines. "One interesting aspect of these debates was the extent to which the adoption of the English/European system of classification of occupation is suitable for the Indian context. Here, one line of dissenting opinion which emerged was that the adoption of the English system – the Dr Farr method – led to the under-enumeration/negation of women's work."

Rewriting social and labour history

The initial goal in research is to build a comprehensive account of the different practices that defined and mediated the relationship between masters and servants, which will open up new insights into India's social and labour history. The master-servant relationship can serve as a template to understand some of the ambivalences and tensions that shaped the making of the colonial empire in South Asia, believes Raveendranathan. "The focus in the project has been on unearthing the quotidian aspects of law making and the servant's role in shaping master-servant laws. Furthermore, it has also opened up the master-servant relationship, to talk not merely about domestic service, but also a wide range of labour relations that existed in colonial India," she outlines. "Many of the insights of the project can be used to make inferences about a set of anxieties that shaped the process of colonialism."

Further, the project aims to steer away from the narrow focus of legally-inflected labour history and instead shift attention towards the domain of the everyday, in which objects and materials (coats, badges and liveries), commodities and things (wine and arrack), and language and command were equally important. A servant was valued as a marker of class status, but also feared for their subversive potential, as they occupied the private and intimate domains of the household, so the nature of the master-servant relationship can to some extent be seen as indicative of wider trends. This was the case not just in India, but also in other countries in the British Empire, a topic that the project will explore through a series of publications based upon individual research and collaborative initiatives. "We are trying to go beyond our regional focus on India, and have put forward a joint publication proposal on the theme of regulating domestic service across the British Empire. So we are in touch with scholars working on Hong Kong, South Africa, and Australia, while we are working on India," he outlines.
