EU Research Winter 2012


EU Research Winter 2012 Issue

JPND

“a European initiative to tackle one of society’s Grand Challenges”

The European Science Foundation: Social Science Strategy

From Latin America to Europe: an International view on Climate Change and Energy.

Highlighting the latest research from FP7

The Leading Dissemination for EU Research


Images: Martyn Poynor and Mandy Reynolds


Editor’s Note

Martin Brodie runs his own media and communications company, after a career in journalism and public relations. After starting his journalism career on the Bristol Evening Post, Martin joined Rolls-Royce plc in Bristol in 1980 and held a number of senior communications positions in the defence and civil aerospace businesses before moving to the group’s corporate headquarters in 1991. From then until his retirement in 2009, Martin was responsible for much of the Group’s international communications and was also involved in UK national media relations throughout this period. His last role was Head of Communications – Europe, Middle East and Africa. His previous roles also included responsibility for engineering and technology, during which he was editor of the first Rolls-Royce environment report, Powering a Better World.


If countries looking to improve their medal tallies at the Rio de Janeiro 2016 Olympic and Paralympic Games are to stand any chance of achieving better results than in London 2012, then they really ought to have already begun investing in the necessary research. Because if one thing has become clear from London 2012, it is that countries that applied the results of such research tended to achieve better results.

Of course, this does not always happen and the performances of individual athletes can defy predictions. Yet there is a big debate under way in Australia, which under-performed with seven gold medals in London, as to why the ‘Australian Sport - The Pathway to Success’ document published by the Australian Government in 2010, which stipulated that ‘innovation, research, science and technology will continue to be drivers of Australian sporting excellence in the coming decades’, did not translate into medal success. It is also interesting to note that virtually half of the United States’ haul of 104 Olympic medals were won by track and field athletes and swimmers. Interesting but not surprising, bearing in mind the importance of the collegiate sports network in the US.

So, perhaps, it was the performance of Great Britain, the leading European country, that offered the best example of how applied innovation, research, science and technology can translate into a better-than-expected medal haul. Indeed, such was the scale of cycling success that some countries suspected Team GB of using special wheels just for the Olympic Games - and hiding them away in between races. However, one has only to look at the work of Research Councils UK (RCUK) to see that universities throughout the UK contributed much to the enhancement of both training and performance in key sports. And Team GB has always been sure to thank the funding from the UK National Lottery, distributed by various sporting bodies, for its contribution to success in London.

Continued funding into sports science and sports engineering will be essential if Team GB and Paralympics GB want to hold on to their medal positions in 2016. Of course, the almost measurable impact of home support will be missing in Rio - but properly-applied research and technology can go a long way to replacing that Olympic roar.

Hope you enjoy the issue.

Martin Brodie



Contents

4 Interview with the European Science Foundation

We speak to the Head of the Humanities and Social Sciences Unit at the ESF, Dr Nina Kancewicz-Hoffman

8 SeeBetter Project coordinator Dr David San Segundo Bello tells us about his research group’s work to design and build a high-performance silicon retina

11 Selfchem Professor Nicolas Giuseppone on the driving forces behind efforts to design smart self-assembled systems with specific new functionalities

14 ENABLE Developing a unified definition of vulnerability for use by healthcare professionals with Professor Samia Hurst

17 EMERGE Professor Rein Ulijn explains how his research into self-assembling nanomaterials addresses some of the most pressing challenges facing society

20 Liver Cancer Mechanisms

Professor Mathias Heikenwälder speaks about his research into the mediators that induce chronic inflammation, and how it then develops into liver cancer


23 SMINSULATOR

Investigating the structure and regulation of chromatin, the combination of DNA and proteins found in the nucleus of a cell, with Dr Marcelo Nollmann

26 Heart-e-Gel Using polymer materials to treat cardiovascular conditions in a less invasive way than currently deployed will boost both patients’ life expectancy and quality of life. Project Coordinator Frank Stam and Professor Paul Herijgers explain

29 EU Joint Programme on Neurodegenerative Disease Research (JPND)

Professor Philippe Amouyel on the scale of the challenge of finding a cure for neurodegenerative diseases

34 The ACIL Professor André Nollkaemper, founder of the ACIL, Dr Hege Elisabeth Kjos, Assistant Professor at the University of Amsterdam, and Drs Martine van Trigt, Management Assistant at the ACIL, discuss their research on International Law through the National Prism

36 Statebglatamerica Juan Carlos Garavaglia of the Statebglatamerica project discusses the study of the process of state-building and the invention of the nation in Hispanic America

39 EMEC

The European Marine Energy Centre (EMEC) Ltd talk about the extraction of renewable energy from the waves and tides of European coastlines

43 NAMASTE We spoke to scientific coordinator Professor Valerio Lucarini about the project’s research developing new theoretical approaches and new diagnostic tools to analyse the structural properties of the climate system

47 EUROSPEC We spoke to project coordinators Drs Alasdair Mac Arthur, Enrico Tomelleri and Albert Porcar-Castell about their ongoing work in developing optical instrument specifications, techniques and protocols to standardise optical measurements

50 Desert Storms Professor Peter Knippertz talks to EU Research about Desert Storms, an ERC funded project which hopes to revolutionise the way mineral dust emissions are treated in numerical models of the Earth system

53 LIGNODECO Jorge Colodette, the coordinator of the project, explains the initiative’s aim of developing new wood pre-treatment techniques



56 Bioelectronics The concept of a genuine synergy between man-made electronics and natural bodily systems was once the stuff of science fiction. But no longer

60 Dirac fermion materials Professor Oleg Yazyev of the EPFL in Lausanne, Switzerland speaks to us of the many challenges that still remain before graphene can be widely applied

62 ENNSATOX We spoke to Andrew Nelson, Dave Arnold and Karen Steenson about their research into the structure and function of nanoparticles, and how this could lead to regulation of the nanoparticle industry

65 Centre for Sustainable Development

EU Research’s Richard Davey investigates the Centre for Sustainable Development, Mexico’s answer to the question of global climate change

69 PAGAP Dr Francis Brown of the PAGAP project explains how the exchanging of ideas and techniques will benefit researchers in both physics and mathematics disciplines

72 FLEXWIN The development of smart radio frequency microsystems, using a highly innovative technology platform to realise re-usable, re-configurable and multifunctional circuits. Volker Ziegler explains

74 InfoMacro Christian Hellwig spoke to us about his research into how lack of information about aggregate economic conditions influences economic outcomes

77 Crossover David Osimo explains how technology and social media can be harnessed to build a new knowledge base that takes full account of human behaviour

80 GEQIT We spoke to Professor Renato Renner, coordinator of the project about their work to develop a new, generalised theory of classical and quantum information

82 iSPACE By developing personalised climate systems the iSPACE project aims to give passengers individual control over humidity, temperature and airflow at seat level, as project coordinator Dr Gunnar Grün explains

EDITORIAL
Managing Editor Martin Brodie info@euresearcher.com
Deputy Editor Patrick Truss patrick@euresearcher.com
Deputy Editor Richard Davey rich@euresearcher.com
Science Writer Holly Cave www.hollycave.co.uk
Acquisitions Editor Elizabeth Sparks info@euresearcher.com

PRODUCTION
Production Manager Jenny O’Neill jenny@euresearcher.com
Production Assistant Tim Smith info@euresearcher.com
Art Director Daniel Hall design@euresearcher.com
Design Manager David Patten design@euresearcher.com
Designer Susan Clark design@euresearcher.com
Illustrator Martin Carr mary@twocatsintheyard.co.uk

PUBLISHING
Managing Director Edward Taberner etaberner@euresearcher.com
Scientific Director Dr Peter Taberner info@euresearcher.com
Office Manager Janis Beazley info@euresearcher.com
Finance Manager Adrian Hawthorne info@euresearcher.com

EU Research
Blazon Publishing and Media Ltd
131 Lydney Road, Bristol, BS10 5JR, United Kingdom
T: +44 (0)207 193 9820
F: +44 (0)117 9244 022
E: info@euresearcher.com
www.euresearcher.com
© Blazon Publishing June 2010

Cert no. TT-COC-2200




Social sciences: a broad field with wide importance

The social sciences are not only important in terms of understanding the roots of human behaviour, but are also relevant to major contemporary challenges. We spoke to the Head of the Humanities and Social Sciences Unit at the European Science Foundation, Dr Nina Kancewicz-Hoffman, about social science research and its wider importance

Social science is a broad field, encompassing everything from psychology and cognitive sciences right through to social anthropology and business studies. Europe has long played a pioneering role in these fields, with researchers helping build a better understanding of human behaviour, creativity and social organisation.

This is not to say that social science research is less relevant today, quite the opposite in fact, given the international, interdisciplinary nature of major global challenges like climate change. We spoke to the Head of the Unit, Dr Nina Kancewicz-Hoffman, about the ESF, the social science research it funds and develops, and the wider importance of its work.

This research is not only important for its inherent academic value, but can also have an impact on the way we live and work. Indeed, social science research, alongside technological and scientific development, is crucial to the future of the European economy and our ability to deal with emerging issues.

Bottom-up research activities

Challenges like climate change, energy sustainability and providing for our ageing population may seem far removed from social sciences, but in addressing them it’s crucial to understand the social context. The scale of these challenges puts them beyond the scope of any single nation, so a coordinated, cross-cultural approach to research is required. The work of the European Science Foundation (ESF) takes on enormous importance in this context. Established in 1974, the ESF covers all scientific fields as part of its aim of promoting high quality European science, managing activities across 11 disciplinary areas, including the humanities and social sciences. The ESF’s Humanities and Social Sciences Unit funds and develops research across a range of areas which seek to examine and explain human behaviour. However, the ESF is currently in a period of transformation and most of the unit’s usual instruments, including bottom-up calls and standing committee strategic activities, will be very limited in future.


EU Researcher: Does the Foundation focus on any particular area of the social sciences, or do you have quite a broad remit?

Dr Nina Kancewicz-Hoffman: The ESF has two types of activities. Some are bottom-up activities, where we announce calls without prioritising specific areas and accept the scientifically best proposals, independent of the topic they address. On the other hand, each Scientific Standing Committee has a budget for strategic activities on specific strategic issues, topics, or subfields which the committee identifies and defines as needing more development. This ensures a balance between the curiosity-driven and policy-driven research agendas. For example, in 2011 the SCSS (Standing Committee for the Social Sciences) funded a strategic workshop on future directions for research on migration issues, as this is a strategically important topic.

EUR: What kind of issues would that research cover?

Dr K-H: The ESF’s activities focus on issues which require pan-European comparison, and which need to be looked at from the point of view of all European countries, or at least a large group. Migration is by its very nature an international phenomenon.



Research on migration is flourishing but is very fragmented over disciplines, perspectives and issues. Disciplines like political science, law, economics, sociology, demography, etc., have their own approaches to studying different aspects of migration at different levels, from the individual right up to the social system, economic system or region. This makes it difficult to achieve a really comprehensive understanding of the relevant phenomena and drivers without adopting an international, interdisciplinary approach.

The best example of the benefits of Europe-wide collaborations is the European Social Survey (ESS), an absolutely unique undertaking, supported from its beginnings in 2001 by SCSS. The ESS is a Europe-wide (covering currently more than 30 countries), academically-driven social survey designed to chart and explain the interaction between Europe’s changing institutions and the attitudes, beliefs and behaviour patterns of its diverse populations. The result is comparable social data. This is a really unique source of social science information.

The strategic workshop, which was attended by participants from all over Europe and the US, therefore aimed to put all these theoretical and methodological approaches in relation to each other, so as to draw up a theory-driven agenda for multi-method and multilevel research on migration processes that would ultimately help us improve our understanding of this very complex social phenomenon. Part of the challenge is to make better use of existing data, but also to collect internationally comparable quantitative and qualitative data.

Research projects

EUR: Is this kind of collaboration particularly important given the scale of the social challenges Europe faces?

Dr K-H: Yes, this is absolutely clear. If we want to compete with large countries like the US or China, we need European collaboration, because individually not even the largest European countries would be an equal partner to the US.

We can exchange and share useful data and experiences from different cultures. This is especially important for humanities and social sciences, because they are closely related to the environment in which they function

This covers all research areas, but of course our focus is on the social sciences. The role of the ESF in these terms is to enable pan-European collaboration, which is similar to the European Union framework programme, but there are differences. First of all, our instruments work on a bottom-up approach. The framework programme, except for the European Research Council, is organised around priority themes. Also, the ESF includes member countries from outside the EU, such as EFTA countries, the Balkan nations and Turkey.


EUR: Could we talk about some of the specific projects funded by the ESF? I understand the EuroHESC project is looking at higher education and social change?

Dr K-H: The EuroHESC programme followed on from our Forward Look on Higher Education. A Forward Look is a foresight instrument outlining the directions in which a particular research domain should go. Research on higher education is an applied domain. Normally researchers in this field are under pressure to deliver advice and factual support for political decisions.

What the researchers wanted to do in this foresight exercise with ESF was to look at their field independently of pressures from policy-makers. They worked on the directions which need to be developed in research on higher education, both theoretical frameworks and Europe-wide comparative approaches. The EuroHESC projects that took up this research agenda worked in particular on issues concerning changes in the steering and governance of higher education and the implications of this for the academic profession and scientific innovation – and all of this with a European comparative approach. The projects are now producing results, but one fundamental finding is that simplistic views about global challenges and policy solutions regarding higher education and research really need to be counterbalanced by a better understanding of the complexity of the situation in Europe and the world. That is, despite common pressures and trends in higher education systems, very substantial differences persist between countries.


This is very important, not least because we want higher education staff, students and knowledge to be able to move from country to country. For this we need to know and understand what is happening in national higher education systems.

We’re also very interested in creating the conditions for collaboration between the humanities and social sciences and other areas of research. We therefore also focus on the issue of inter-disciplinary collaboration as such. We recently had several projects which looked at how to facilitate such collaborations and how to prepare young researchers for inter-disciplinary work. In the workshop ‘The Good, the Bad and the Ugly – Understanding collaboration between the social sciences and the life sciences’ researchers discussed conditions necessary for such collaborations to be fruitful.

EUR: Why do you see this kind of mobility as being important?

Dr K-H: If we want research to progress efficiently we cannot afford to study the same research issues in every country. Mobility of researchers and doctoral students helps to avoid duplication and fragmentation. Europe-wide networks or European excellence centres attracting researchers from all over the world and focusing on different areas will enable Europe to compete more effectively. Also international collaborations enable us to exchange and share useful data and experiences from different countries. This is especially important for humanities and social sciences, which study cultural and social contexts.

EUR: Can social science research also have an impact on other areas of the ESF’s work? Is there a lot of potential for interdisciplinary collaboration?

Dr K-H: The ESF is a very interesting place from this point of view, as it covers a wide range of research areas, from physical and engineering sciences all the way through to social sciences and humanities.

The foundation has two types of activities. Some are bottom-up activities, where we announce calls without prioritising specific areas. On the other hand, each Scientific Committee has a budget to focus on specific strategic issues which the committee identifies and defines

This means that in ESF we are very well placed to pursue interdisciplinary research connecting seemingly distant disciplines. I would say that social sciences are an equal research partner of ‘hard’ sciences in the ESF.

We have been actively developing interdisciplinary research in several areas over recent years – one of them is global environmental change. The ESF Forward Look RESCUE was an attempt to further research in this area by integrating humanities and social sciences with philosophy, history, economics and sociology. Another ‘hot’ area is research on health systems, where there is a need to combine medical research with work on ethical and philosophical issues, as well as relevant social and economic issues. The ESF will publish a report on the future of personalised medicine in the coming months.

This is an excellent example of how close collaboration between humanities, social sciences and medicine can contribute to understanding important social challenges. When developing personalised medicine it is necessary to take into consideration the potential ethical consequences, the economic costs and how it will function in society.


EUR: Do you see this approach as being important to advancing the scientific agenda?

Dr K-H: Yes, but in our experience it should start early. We often conclude our activities by saying there should be some kind of training for young researchers, preparing them to collaborate with people from other research areas. We recently organised, on the initiative of the Standing Committee for the Humanities and involving all other ESF Standing Committees, a training event, the ESF Junior Summit ‘Water: Unite and Divide. Interdisciplinary approaches for a sustainable future’, where over 36 early career researchers discussed the challenges and opportunities posed by inter (multi, trans) disciplinary research. Rather than establishing a theoretical discussion at a meta-level, the event addressed the specific topic of ‘Water: Unite and Divide’ so that the discussions were able to focus on concrete issues relating to the participants’ own research interests and experiences. The conviction that interdisciplinary research is a key to advancing the research agenda is also affecting the way we organise peer review of proposals submitted to us. We always make sure proposals concerning different research areas are evaluated by representatives of those areas. So if we have a proposal on the economic aspects of medicine for example, we make sure that representatives of both the medical community and social sciences give an opinion and are involved in the decision.

EUR: What is the role of social science in those kinds of collaborations?

Dr K-H: Understanding the societal context is absolutely necessary to dealing with major challenges.



If you think about energy, there is a certain social context, in terms of producing, using and saving it. To deal with the issue it’s not enough just to have a new source of energy – it’s important to also understand the economic and social context.

Encouraging innovation

EUR: Does inter-disciplinary research also help promote technological development? Can you create a culture that encourages scientific and technical innovation?

Dr K-H: Of course social studies like the examples mentioned above on global environmental change or on personalised medicine should help technological development. The ESF SCSS has, however, more modest but nevertheless challenging goals. One of our aims is to show policy-makers that social sciences can be used to inform policy decisions. If policy-makers use social science research to inform their decisions they may take more efficient decisions which bring positive results. For example, by organising together with the European Parliament’s STOA (Science and Technology Options Assessment panel) a conference on ‘Science of Innovation’, we aimed at making policy-makers more sensitive to studies on innovation, which indicate that some of the models of supporting innovation promoted in the framework programme are a little bit simplistic.

EUR: To what extent is the research agenda shaped by current events? Is it driven by current social priorities?

Dr K-H: In our bottom-up approach, social science researchers often come up with topics which are a reaction to recent developments, such as migration, higher education, energy and the environment, which are inter-disciplinary in scope. It is our experience that bottom-up proposals are by nature just as likely to address issues that are important to society as they are to address strategic considerations. With these strategic considerations maybe we are looking a little bit more towards the long-term, in terms of the development of whole research subthemes or directions.

EUR: This includes looking at issues as big as the future of science in society? How are you taking that work forward?

Dr K-H: The ESF started to discuss this issue at the initiative of our member organisations, which are concerned that there is a lack of understanding between the research community and society at large. Our Member Organisation Forum recently published a report, ‘Science in Society: a Challenging Frontier for Science Policy’. The report aims to encourage better practice in the relationship between science and society. In parallel the ESF SCSS has initiated a strategic activity on ‘The Future of Science in Society’ that will produce a Science Policy Briefing. The briefing will deliver an account of the different sites where science/technology/society relations take shape, looking beyond the traditional SiS debates around the public. For how science, technology, innovation and society are intertwined and governed in Europe is undergoing quite a substantial change with the creation of the European Research Area and even more so with recent efforts to position Europe as an innovation space by 2020.

EUR: Does that further reinforce the importance of collaboration and working together with international partners?

Dr K-H: Yes, absolutely. I think that one of the strengths of the way we work – especially having a representative of each country on each committee – is that we are able to bring different European regions together. Exchanging experience, information and working methods is very important, and where participation is uneven it’s even more important. We always make an effort to include people from regions where the discipline is maybe less well-established, so that they can then encourage its future development.

www.esf.org


Dr Nina Kancewicz-Hoffman, Head of the Humanities and Social Sciences Unit at the European Science Foundation



A new vision of image sensors

Researchers from the SeeBetter project are drawing inspiration from the biological retina as they seek to make improvements on current state-of-the-art image sensors. Project coordinator Dr David San Segundo Bello tells us about his research group’s work to design and build a high-performance silicon retina using advanced photodetector technology

Studies of the biological retina offer important insights for researchers seeking to improve the performance of image sensors. A lot of research has been done over the last few years on temporal image sensors and materials that mimic the most basic function of the mammalian eye, but conventional image sensors remain relatively limited in comparison to biological retinas, an issue the SeeBetter project is working to address. “The overall goal of the SeeBetter project is basically to improve on the current state-of-the-art of image sensors, using information about how the biological retina works,” outlines Dr David San Segundo Bello, the project’s overall coordinator. The project is working to design and build a high-performance silicon retina, building on research into the biological retina. “We want to take the work that is being done by one of our partners, FMI (Friedrich Miescher Institute for Biomedical Research) in Basel. They are the biological partner of the project and are doing a lot of biological measurements and tests to understand how the retina works,” continues Dr San Segundo Bello. “From all the new things they are discovering, we want to extract those aspects that can be easily implemented on an electronics circuit.”

“We want to take the work that is being done by one of our partners, FMI in Basel. They are the biological partner of the project and are doing a lot of biological measurements and tests on mammals to understand how the retina works. We want to extract those aspects that are easier to implement on an electronics circuit.”

Image sensors

The goal in this work is not necessarily to exactly replicate the biological retina, but rather to learn from it and take those aspects which can be implemented on an electronics circuit and will add functionality to image sensors. Currently image sensors work very differently to the mammalian eye, so there is a lot of room for modification. “Conventional image sensors accumulate light by using what is called the shutter time, or exposure time. The light interacts with a type of matter, typically a semi-conductor – a charge is then generated that is proportional to the intensity of the light. This charge is then integrated for a certain time and later processed,” explains Dr San Segundo Bello. This process imposes inherent limitations on the performance of the sensor. “The noise that is generated by these interactions is large. You then have a limitation in the amount of charge that you can store, so you have a limitation in the amount of detail in the scene,” continues Dr San Segundo Bello. “If you use your mobile phone or your digital camera to take a picture at very low light then you need certain settings. These settings do not allow you to mix very low light and very high light.”

The difference between the low light and high light parts of an image is known as the dynamic range, and here conventional image sensors are also typically quite limited. However, the SeeBetter project’s image sensors do not work by accumulating the light generated, but instead look continuously at the amount of light in a scene. “The sensor actually looks at the rate of change, the derivative of the amount of light. You can picture it as comparing the current amount of light to the amount of light a bit earlier, not to a fixed value. If there is a difference in the amount of light, a digital pulse indicates that a change was detected, and also if the change was towards more or less light. If there was no difference, no information is sent,” explains Dr San Segundo Bello. “There is a kind of differentiation. If you are looking at a scene where nothing is moving, then you would not get any response, because the amount of light one second before would be the same. So there is no information to send. But if something moves within the scene, or if the light changes, there is a difference between the light at that moment in a pixel and the light in it a few moments before.” This is the information that is then sent through to generate the image.

This method is based on activities rather than frames, but Dr San Segundo Bello says it also has some limitations, such as in terms of bandwidth capacity. “If all the pixels have information you are going to have to compromise in which pixels you read, or you will have some delays. So there are certain applications, such as if everything is moving, where this would not really be useful, but for most applications only part of the scene changes,” he outlines. The type of sensor the project is using is capable of very high frame-rates, because it only looks at changing information; however, while the biological retina holds important insights, Dr San Segundo Bello says it is important to also take account of its limitations. “There are other kinds of image processing that would be different from what the retina does in that respect. So you have to be aware that the biological retina of animals also has its limitations,” he acknowledges. “We know that certain functions are performed very efficiently by the biological eye – we are trying to see if we can understand the way these work, so that we can do the same thing in a silicon retina.”

Development constraints

There are also constraints in the development of silicon retinas, particularly in pixel size, chip size and the complexity of implementing certain functionality. Researchers at Imperial College London (ICL), another partner in SeeBetter, are looking at the information FMI has gathered on how the retina works and developing mathematical models to help the project identify key priorities. “They look at all the things that FMI has extracted from how the retina works, then look at issues like ease of implementation, usefulness and cost. Then we have to make the choice of course,” says Dr San Segundo Bello. “From these investigations, researchers at the University of Zurich can adapt, change and improve their previous image sensor designs. The team at the University of Zurich have been pioneers for years in the design of image sensors based on the concept of emulating the simplest functionality found in the biological retina, so they were the obvious choice for this part of the work. Tobi Delbruck’s research group at the University of Zurich has been developing chips based on the “Dynamic Vision Sensor” (DVS) concept for almost ten years already. The principle of the operation of such a device is shown in the accompanying figure. Examples of raw images taken with existing versions of DVS cameras are also shown in the figure.” Imec contributes by applying our back-side processing technology to the manufactured silicon sensor to improve sensor sensitivity.
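To make the event-based readout described above concrete, here is a minimal software sketch of how a DVS-style pixel can be modelled. It is illustrative only and is not taken from the SeeBetter designs: each pixel compares its current log intensity with the value stored at its last event and emits an ON or OFF event only when the change exceeds a threshold, so a static scene produces no output. The function name, threshold and test pattern are assumptions made for the example.

```python
import numpy as np

def dvs_events(frames, threshold=0.15):
    """Toy model of a DVS-style sensor: each pixel compares its current log
    intensity with the level stored at its last event and emits an ON (+1) or
    OFF (-1) event when the change exceeds the threshold. Static pixels stay silent."""
    log_ref = np.log1p(frames[0].astype(float))      # per-pixel reference level
    events = []                                      # (frame index, y, x, polarity)
    for t, frame in enumerate(frames[1:], start=1):
        log_now = np.log1p(frame.astype(float))
        diff = log_now - log_ref
        ys, xs = np.where(np.abs(diff) > threshold)
        for y, x in zip(ys, xs):
            events.append((t, y, x, 1 if diff[y, x] > 0 else -1))
            log_ref[y, x] = log_now[y, x]            # reset the reference at this pixel
    return events

# Example: a single bright dot moving across an otherwise static image.
frames = np.zeros((5, 32, 32), dtype=np.uint8)
for t in range(5):
    frames[t, 16, 5 + 4 * t] = 255
print(len(dvs_events(frames)), "events")             # only the moving dot generates events
```

Running the sketch reports eight events in total: an ON event where the dot appears and an OFF event where it disappears in each new frame, while the unchanged background contributes nothing, which mirrors the behaviour Dr San Segundo Bello describes.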

Examples of different raw images captured with a DVS camera (panels: rotating dot stimulus, juggling, driving scene, faces, highway overpass and eye; each snapshot is labelled with its approximate number of events and capture time).




At a glance

Full Project Title
Seeing Better with Backside Illuminated Spatio-Temporal Silicon Retina (SEEBETTER)

Project Objectives
1: To use genetic and physiological techniques to understand the functional roles of the 6 major classes of retinal ganglion cells.
2: To build a computational model of retinal vision processing from the viewpoint of biology, machine vision, and future retinal prosthetics.
3: To build the first high performance silicon retina with a heterogeneous array of pixels specialized for spatial and temporal visual processing and with a data-driven, event-based readout.
4: To build the first back-side illuminated silicon retina vision sensor.

Contact Details
Project Coordinator, David San Segundo Bello
IMEC, Kapeldreef 75, B-3001 Leuven, Belgium
T: +32 16 28 12 29
E: David.SanSegundoBello@imec.be
W: www.seebetter.eu

David San Segundo Bello

Principle of operation of a DVS pixel. Increases in the amount of light reaching the pixel give rise to “ON” events; decreasing light levels give rise to “OFF” events. The rate of change translates into the rate of events. No change in light level means no events are generated.

The main target applications for this research are in biology, machine vision and retinal prosthetics. Several projects around the world are working on artificial retinas; if researchers can develop an artificial retina that works in a similar way to the real retina, it might be easier to apply, while Dr San Segundo Bello says there are also other applications which need this particular type of image processing, for example surveillance. “It could be used to inspect factories where objects are moving on a conveyor belt, so you can check whether the objects have a certain shape,” he outlines. Machine vision and retinal prosthetics are the two most obvious applications but Dr San Segundo Bello expects there will be others, offering significant scope for further research in any potential follow-up project. “We could choose one or several applications where it really would pay off to optimise the functionality that we have implemented. Then we could target the sensor specifically towards these applications, put it in the field and think of all the steps that need to be covered,” he says. “We consider this project as a first step towards further development. So in that regard, once the project has finished, if we are successful then a lot of opportunities could open up. And of course we will probably need to look for partners who have more application-specific knowledge.”

Project Coordinator

David San Segundo Bello received his Ph.D. degree from the University of Twente, Enschede, The Netherlands. Since 2008, he has been with Imec, Leuven, Belgium, and has been working mostly on the design of electronic systems and integrated circuits for image sensors and also high-resolution data converters.




Self-organisation at the nanoscale

Designing molecules capable of self-assembling into functional nano-objects and materials is a complex challenge. We spoke to Professor Nicolas Giuseppone of the SELFCHEM project about the driving forces behind efforts to design smart self-assembled systems with specific new functionalities

Designing molecules able to self-assemble into functional new materials and nano-objects is one of the greatest challenges facing physics, chemistry and bio-materials science researchers. Part of this work involves developing materials which can respond to their environment; however, this may not in itself be sufficient to achieve really smart properties. “There is a lack of understanding over which systems are responsive and which can truly evolve to produce advanced functions,” explains Professor Nicolas Giuseppone, the coordinator of the SELFCHEM project, an ERC-funded initiative addressing some of the many questions around self-organisation processes, hierarchical organisation and materials science. Professor Giuseppone says developing smart systems is a key part of this work. “In addition to sensing their environment, smart systems should be able to adapt to it, possibly by changing their constitution in order to become more efficient under specific conditions or to multitask,” he continues. “This could concern two distinct kinds of functional systems acting either at thermodynamic equilibrium as molecular self-assemblies, or far from equilibrium, as living systems do.”

The project’s specific goals focus on the transfer of chemical or physical information through the self-organisation of supramolecular carriers. Self-organisation is a central part of the evolutionary process proposed by Darwin; Professor Giuseppone is asking the same kinds of questions as the famous naturalist, but this time with regard to self-organisation at the nanoscale. “Are there rules which sustain a sort of molecular Darwinism and can we take advantage of them to design new adaptive materials? If yes, one may expect that such advanced artificial systems will display several features which are present in living systems. In particular, these new materials should ultimately combine three key properties of life, which are the abilities to metabolize, mutate and self-replicate as a whole system,” he says. “This is indeed a systemic approach to responsive materials. If we can get a better understanding and control over the basis of self-organisation, we will then be able to take advantage of it for the design of ‘information-gaining’ systems, producing new functions by themselves.”

Figure 1. Light-triggered self-construction of supramolecular organic nanowires as metallic interconnects. A: nucleation of the fibers by supramolecular associations of a modified triarylamine upon a flash of light. B: AFM image of the self-assembled fiber (© Wiley-VCH Verlag GmbH & co. KGaA ). C: directed growth and insertion of the nanometric fibers driven by the electric field within electrodes. D: AFM image of the fibers conducting the electric information between the electrodes. (© Nature Publishing Group). Related publications: Angewandte Chemie International Edition 2010, 49, 6974; Nature Chemistry 2012, 4, 485-490; and Advanced Materials 2012 DOI: 10.1002/adma.201201949.



Figure 2. Transfer of controlled mechanical motions from the molecular level to higher length scales. Inspired by the functioning of muscular cells, thousands of artificial molecular machines capable of 1 nm scale contractions and extensions were coordinated together, in space and time, by using supramolecular chemistry. The integrated motion of the resulting polymer chain was amplified by four orders of magnitude. Middle: Molecular modeling of three molecular machines linked within the supramolecular polymer chain. © Wiley-VCH Verlag GmbH & co. KGaA. Related publication: Angewandte Chemie International Edition 2012, DOI 10.1002/ANIE.201206571.

Self-organisation

The project is investigating the duplication of chemical information, its transfer and conversion, and the transport of physical information to improve the understanding of the basis of self-organisation. While all natural systems self-organise, it is not currently known whether they self-organise with common rules. “If we can understand this, we will then be able to design smarter materials. For instance, we would like to construct complex systems starting from very simple building blocks because they will be able to use the external energy to improve their organisation while dissipating this energy,” says Professor Giuseppone. Self-organisation is an emerging property in itself, and Professor Giuseppone believes it could also give rise to new functional properties in materials science. “For any kind of functional material, you can expect to switch on and off – or modulate – its properties by recombination and self-organisation processes,” he says. “But life has proven to go further and to create new ‘bio’ functions from scratch during the evolutionary process. Chemistry can in principle create an unlimited diversity of molecules and self-assemblies.”

From here there is no limit to new systems that could potentially be created in terms of structures. However, researchers first need to understand the basis of self-organisation in greater detail. “Living systems always keep the majority of their constituents when making copies of themselves, although they can alter part of their genes, which generates new proteins and metabolites which can improve their ability to adapt to their environment,” explains Professor Giuseppone. “This diversity helps living systems explore new and more efficient phenotypes. This can be seen the same way at the molecular and supramolecular levels, where libraries of components can be generated until one fits better to its environment. The use of reversibility in generating this diversity is a key element, because reversibility of self-assemblies in general provides systems with new functionalities such as error-checking and self-healing properties which are necessary for adaptation.”

Hierarchical organisation

The hierarchical organisation in systems chemistry is also key to these targeted functionalities. Several layers of structures and dynamics are involved in networks, with each playing a major role in the system as a whole. “Self-assemblies can be seen as molecules linked by supramolecular bonds. These products can all be exchanged by chemical reactions which can define a ‘reaction network’ which also gives rise to complexity. Depending on the topology of these networks, systems can become more robust, which means that they can survive even if one or more of their constituents fail,” says Professor Giuseppone. These networks can all communicate with each other, crossing the different length scales and affecting different levels of the system.



At a glance

Full Project Title
SELFCHEM

Project Objectives
The objectives of the SelfChem project are oriented towards a better understanding of complex systems, self-organisation, and the emergence of order from chaos. It aims at understanding multi-component chemical systems and how the circulation of information can be the driving force for the formation of functional self-assembled nano-structures and reaction networks.

“An event at the molecular level can affect the supramolecular one and then the reaction network. This can be reversed and by perturbing the reaction network, you can produce new molecules, paving the way to adaptation and creating communication pathways through space and time. The transfer of information between these objects is a key part of our project,” continues Professor Giuseppone. By understanding the structure and basis of self-organisation researchers could then use it to develop new functional materials or nano-objects. Professor Giuseppone says there are two different approaches. “Imagine that you want to create an inhibitor for a given enzyme. With the classical sequential approach you can synthesize 10,000 molecules separately, purify each of them, and test them against the target,” he explains. “With the systems approach, for instance in a dynamic combinatorial library, we proceed in a single step: let’s mix the building blocks which exchange under reversible conditions to produce a pool of dynamic inhibitors; then, mix your library directly with the enzyme and see if one product is spontaneously amplified because it binds to its target more strongly. This is not only true for binding events, we can also go further by considering more general changes in environmental conditions which could affect the formation of a given product.”

Synthesising nanowires

Using these fundamental principles, the Giuseppone group has shown that it is possible to synthesize a new class of nanowires, which conduct electricity with extraordinary efficiency, by simply placing their building blocks in an electric field, for example. Moreover, this selected supramolecular structure is spontaneously self-organised exactly between the electrodes at a scale of 100 nm, demonstrating that it is possible to obtain the right material for the right function, at the right moment and at the right place. “By conferring such properties on these kinds of systems, chemists will change their way of thinking, going from directed synthesis to self-construction. In other words, the objective is to let the system explore the energetic landscape itself, to find the combination best suited for a given purpose,” explains Professor Giuseppone. The project has established links with the commercial sector, and while the project’s work is still at an early stage, Professor Giuseppone believes their research could be relevant to several areas of industry. “Long term benefits could be expected in the domains of self-healing materials and self-repairing devices, smart drug carriers and nanotechnologies to precisely control bottom-up fabrication,” he says.
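As a purely illustrative toy model, not drawn from the SELFCHEM project, the amplification effect Professor Giuseppone describes for dynamic combinatorial libraries can be pictured with Boltzmann weights: the reversibly exchanging members start with similar stabilities, and adding a target that binds one member lowers that member’s effective free energy and shifts the whole pool towards it. The member names, energies and the 3 kT binding bonus below are arbitrary assumptions chosen only to show the re-equilibration effect.

```python
import math

def equilibrium_fractions(free_energies_kT):
    """Boltzmann populations of library members given free energies in units of kT."""
    weights = [math.exp(-g) for g in free_energies_kT]
    total = sum(weights)
    return [w / total for w in weights]

members = ["AA", "AB", "BB"]            # hypothetical library members from two building blocks
g_intrinsic = [0.0, 0.1, 0.2]           # similar stabilities: a roughly even dynamic pool
g_with_target = [0.0, 0.1, 0.2 - 3.0]   # the target binds "BB", lowering its energy by ~3 kT

print(dict(zip(members, equilibrium_fractions(g_intrinsic))))     # ~0.37 / 0.33 / 0.30
print(dict(zip(members, equilibrium_fractions(g_with_target))))   # "BB" now dominates (~0.90)
# Because the exchange is reversible, the pool re-equilibrates and the best binder
# is amplified at the expense of the other members once the target is present.
```

This mirrors the idea quoted above of letting the system explore the energetic landscape itself and settle on the combination best suited to its conditions.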

Contact Details
Project Coordinator, Professor Nicolas Giuseppone
Faculty of Chemistry, University of Strasbourg, SAMS Research Group - icFRC, Institut Charles Sadron, 23 rue du Loess, BP84047, 67034 Strasbourg Cedex 2, France
T: +33 (0)3 88 41 41 66
F: +33 (0)3 88 41 40 99
E: giuseppone@unistra.fr
W: www-ics.u-strasbg.fr/sams

• Giuseppone, N., Towards Self-constructing Materials: A Systems Chemistry Approach, Acc. Chem. Res. 2012, advance online publication (doi: 10.1021/ar2002655).
• Moulin, E., Cormos, G., Giuseppone, N., Dynamic Combinatorial Chemistry as a Tool for the Design of Functional Materials and Devices, Chem. Soc. Rev. 2012, 41, 1031.

Professor Nicolas Giuseppone

Project Coordinator

Professor Nicolas Giuseppone is deputy director of the Institut Charles Sadron. He became Full Professor of Chemistry in 2007 at the University of Strasbourg, after which he created the SAMS research group at the ICS. His current research interests focus on Supramolecular Chemistry and Systems Chemistry, more specifically for their implementation in soft-matter science.



Tailored tools to protect vulnerable patients

We all expect high quality treatment from healthcare systems, but providing adequate protection for vulnerable people remains a major challenge. We spoke to Professor Samia Hurst of the University of Geneva about her work to develop a unified definition of vulnerability for use by healthcare professionals

It is generally agreed that vulnerable people deserve special protection or attention from our healthcare systems, but applying these protections remains a major theoretical and practical challenge. Based at the University of Geneva’s Institute for Biomedical Ethics, Professor Samia Hurst is coordinator of the Protecting Vulnerable Persons in Healthcare project, which aims to develop decision-making processes which will help protect vulnerable people in healthcare systems. “We started with two puzzles. First: there is broad consensus that vulnerable persons deserve greater attention or protection, but little consensus or even clarity on who the vulnerable are and what protections may be adequate. Second: we generally agree that fairness is important in healthcare, but agreeing on what fairness demands in this context is notoriously difficult,” she says. Vulnerability is part of the human condition to some extent, but some people are comparatively more likely to incur unjustified manifestations of vulnerability. “They are more likely to have their claims disregarded and thus to encounter wrongs,” continues Professor Hurst. “Applied to health care, this means that we can consider particularly vulnerable those who encounter an increased likelihood not to have their claims taken into just consideration.”

Vulnerability

The first step in the project is to develop a clear definition of vulnerability that can be used by healthcare professionals and decision-makers, as well as patients and patient advocacy groups. The scope of this work is broad, bringing together researchers from different branches of medicine and philosophy to develop a unified definition that integrates all the various forms of vulnerability. “We are doing conceptual work to provide a definition of vulnerability that works. Once you have done that you can identify vulnerable populations, in part through existing data in the literature on, for example, the psychology of decision-making,” outlines Professor Hurst. Researchers are drawing from the sciences to understand why the claims of vulnerable people might not be considered as fairly as the claims of the less vulnerable. This difference in the evaluation of claims may often not be intentional on the part of healthcare professionals, but this fact does not make it less damaging to those who are vulnerable. “One hypothesis is that things like in-group bias – the fact that we tend to be more favourably inclined to people who are more like us – might have an impact on clinical decision-making,” says Professor Hurst. “This might actually explain part of what has been described in emergency situations, where there are disparities in the way in which people from different ethnic groups are treated.”

Protecting these patients means ensuring their interests are taken into account in the same way as less vulnerable parts of the population. This does not necessarily mean treating everybody in exactly the same way, but rather moving towards tailored interventions that take the patient’s personal situation into account. “In some cases it means you will have to expend more effort for vulnerable patients simply to give them the same as everyone else,” says Professor Hurst. The project aims to help doctors assess each individual patient and tailor their interventions accordingly; Professor Hurst points to efforts to encourage patients to stop smoking as an example of how successful a tailored approach can be. “If you try and convince your patient to stop smoking there are conversations that are more or less scripted to help you do that, and also to adapt to where that person is with cigarettes, whether they have already contemplated stopping smoking, whether they’ve already tried, how willing they are and so on,” she explains. “These conversations have been tested experimentally and they work. So if you take five minutes to try and convince your patient to stop smoking as a GP, a proportion of your patients will stop smoking.”

Effective communication is crucial to the success of such interventions, and they need to be tailored further for some populations: this tends to be easier for doctors with patients of a similar educational background, as they are more likely to use similar language and be able to establish a personal rapport. Where that isn’t the case, these interventions work less well. Sometimes there are even bigger hurdles to overcome; migrant workers for example might be unfamiliar with how the health system works, not know the language, and have a precarious work situation, an accumulation of vulnerabilities that is greater than the original issue.






At a glance

Full Project Title
Protecting Vulnerable Persons in Health Care (ENABLE)

Project Objectives
The aims of this project are to A) outline how vulnerability in health care can be defined consistently; B) explore how claims for protection can be weighed against one another in scenarios of deep scarcity; C) describe characteristics identified by the neurosciences which could affect clinician bias and, through it, vulnerability in clinical care.

Project Funding
This project is supported by a grant from the Swiss National Science Foundation (PP00P3_123340)

Contact Details
Project Coordinator, Professor Samia Hurst
Institute for Biomedical Ethics, CMU/1 rue Michel Servet, 1211 Genève 4
T: +41 22 379 46 01
E: samia.hurst@unige.ch
W: http://www.unige.ch/medecine/ib/ethiqueBiomedicale/recherche/enable.html

Hurst SA: Ethique et santé publique (Ethics and public health). Les Ateliers de l’Ethique. 2012; 7(3):59-67
Tavaglione N., Hurst SA: Why physicians ought to lie for their patients. American Journal of Bioethics. 2012; 12(3):4-12
Hurst SA: Ethics and non-biomedical research with human subjects. Swiss Medical Weekly. 2011; 141:w13285
Hurst S: Vulnerability in Research and Health Care; Describing the Elephant in the Room? Bioethics. 2008; 22(4):191-202

Left to right: Samia Hurst MD, Chloë FitzGerald, Nicolas Tavaglione and Angela Martin

Samia Hurst MD directs the project. She is SNF professor of Bioethics at Geneva University’s medical school in Switzerland, ethics consultant to the Geneva University Hospitals’ clinical ethics committee, and member of several ethics committees including the research Ethics Review Committee at the World Health Organization. Chloë FitzGerald, postdoctoral fellow, philosophy. Nicolas Tavaglione, senior research fellow, political philosophy Angela Martin, PhD student, philosophy.


original issue. “We aim to provide a modular definition of vulnerability, where different vulnerabilities add up or even have an effect on one another, as some of them actually increase other vulnerabilities. Then it becomes easier to measure the presence of such vulnerabilities in individuals, and to compare groups with or without them in public health. We are hoping to provide not just doctors, but also researchers, with better tools for that,” says Professor Hurst. These are key issues in terms of the fairness of healthcare systems, which is the subject of ongoing debate. “There is consensus on the need to protect vulnerable people, much more than there is on what it means to have a healthcare system that is as fair as possible,” continues Professor Hurst. Resource allocation is of course a central issue in these terms. However, while healthcare systems vary widely, Professor Hurst says vulnerability is not directly related to the availability of resources. “Vulnerable persons will still be vulnerable whether or not there is a high degree of resources in the health system,” she stresses. Protecting vulnerable people becomes harder when resources are scarce, but this doesn’t mean the definition itself changes, rather that efforts allocated to protecting the vulnerable need to increase. Professor Hurst describes a hypothetical situation where two patients need to be educated about self-care for diabetes; one is fluent in the local language, the other has communication problems. “You want both patients to be aware of the precautions they must take when they leave your consultation. They need to be aware of how to test their glycaemic level, how to inject insulin and so on,” she outlines. “In the end you can achieve the same thing for both. But in the case of the one that doesn’t speak the language very well it might take you
more time – you will need to tailor your message differently, you will maybe need additional help, possibly a translator and so on. That is the additional effort that this form of vulnerability will require.”

Areas for improvement
A clearer definition of vulnerability could not only enable more individualised treatment of patients, but also help researchers identify areas of healthcare systems which require improvement. However, vulnerability can take many forms, and the project’s definition does not label specific persons as particularly vulnerable. “Since there are many circumstances that can cause vulnerability, we are basically all at risk of being particularly vulnerable in one way or another at some point of our lives,” explains Professor Hurst. Educating clinicians and healthcare professionals about vulnerability and providing them with tools to minimise bias is crucial to protecting people in these circumstances, and is a central part of the project’s goals. “I’m hoping that the major outputs of the project will be a set of elements that we can include in teaching doctors and medical students, which will help them protect vulnerable persons better and make them more aware of vulnerability as a dimension in clinical care, so that they can reason through this in the same way they reason through other aspects of clinical care,” outlines Professor Hurst. “I’m also hoping we can produce results that will bring forward the question of resource allocation.” “I’m not claiming that we will solve this issue, but I’m hoping that we will take it some steps forward. Finally, I’m also hoping that we will provide tools that help public health researchers measure how we’re doing with vulnerable populations better than is the case currently.”



Building on the building blocks of life

It is increasingly thought that bottom-up approaches, using self-assembly principles, offer the best route to synthesising functional nanomaterials. These materials could help researchers address some of the most pressing challenges facing society, as Professor Rein Ulijn of the EMERGE project explains

The ability of biological systems to evolve and adapt to new situations is of great interest to researchers, who see it as being relevant to many of the technological challenges we face, including those in healthcare and energy. The EMERGE project is working to capture certain biological systems and replicate their properties. “We have set ourselves the challenge to basically minimise or simplify, as much as possible, what we see in biology, and then produce man-made materials based on this minimalistic biology,” explains Professor Rein Ulijn, the project’s scientific coordinator. There are arguably only three molecular processes in biological systems; these processes are an important focus area for the project. “One is known as molecular recognition, the ability of two molecules to recognise and bind to each other selectively based on molecular level interactions that are built into the ‘host’ and ‘guest’ molecules,” outlines Professor Ulijn. “Another is known as molecular self-assembly, the ability of many molecules to come spontaneously together to form a structural material or object. The third molecular process is catalysis. In biological systems, these three molecular processes are based on just 20 chemical building blocks. These same 20 building blocks, the amino acids, are largely responsible for molecular processes across all life forms.” These building blocks are combined in different ways to produce larger molecules called peptides, which leads to the development of the key processes of catalysis, molecular self-assembly and molecular recognition in natural biological systems. Combining peptides with different functions then boosts these systems’ ability to move, to respond and adapt to their environment, and ultimately their ability to evolve. “This has helped tissues to heal and regenerate and plants to develop the ability to convert sunlight into energy, for example,” explains Professor Ulijn.

Mimicking processes
The project is essentially trying to mimic these three chemical processes in man-made materials by using those same building blocks of life, but in a minimalistic way. Researchers are using simple biomolecules, much simpler than those found in nature, to make basic model systems of what happens in biology. “We aim to combine catalysis, self-assembly and molecular recognition to achieve certain functions,” says Professor Ulijn. Self-assembly uses a bottom-up approach which starts on the molecular level. “We design the peptide molecules, based on the amino acid toolbox, and they come together to form larger, nano-scale objects. These ultimately can form networks that in turn form gel-like structures that are macroscopically observable,” explains Professor Ulijn. “Basically when you add these molecules to water, they spontaneously organise themselves to maximise attractive complementarity between the peptides, this results in the formation of larger structures such as fibres. These fibres may become entangled and then immobilise the water, and the water starts to look like a gel phase which expresses the properties of the molecular building blocks. So this is about design at the molecular level, but it translates to the nano, to the micro level and to the macroscopic level.”

Self Assembled Gel: Atomic force microscopy image and proposed molecular structure of a self-assembled peptide gel (image produced by Dr Robert Mart, University of Cardiff, UK).



We have set ourselves the challenge to basically minimise or simplify, as much as possible, what we see in biology, and then produce man-made materials based on this minimalistic biology

A sound understanding of how self-assembly processes work is crucial to the project’s work. Professor Ulijn and his colleagues are studying self-assembly processes to try and derive some general rules which can be used in developing new materials. “We are studying how self-assembly occurs and how it can be directed to drive and control the self-assembly process better. Biology is very good at this control, but as chemists we still have a lot to learn,” he says. The second key process, molecular recognition, relies on the elucidation of specific patterns of amino acids that bind with high selectivity. “There’s actually a whole battery of different possible molecular patterns that are known, that we can use to introduce selective binding into our nanostructures,” continues Professor Ulijn. “These patterns are often
orthogonal, in that you should be able to form different self-assembled structures in one pot.” The most challenging of the three molecular processes, in terms of creating minimalistic mimics of biological systems, is catalysis. Molecular catalysis in biology is performed by complex protein molecules, known as enzymes. “When studying catalytically driven assembly in the laboratory we currently use biological enzymes, as scientists are not yet able to produce simpler mimics of these systems,” says Professor Ulijn. Using enzymes in this way has enabled the team to dramatically enhance the control of self-assembly processes; however, they are keen to take enzymes out of the equation and produce minimal mimics of these. “This is our ten-year challenge. We are developing new
screening methodology to identify minimal catalytic systems, and we have identified the first examples. They do not work very well yet but we are hopeful that we will be able to produce efficient catalysts in the future.”

From structure to function
One area where self-assembled systems are showing promise in terms of real-world applications is their use as scaffolds for the culturing of cells in the laboratory. Cells are currently often cultured on plastic surfaces, which is a poor mimic of biological tissues, hence there is a need for more realistic models. It is known that the chemical composition and stiffness of the gels is critical to obtaining the desired cell response. “We’ve been able to control the self-assembly process, and as a result control the stiffness of gels that we make.



At a glance
Full Project Title
Enzyme-driven MolEculaR nanosystEms (EMERgE)

Catalytic Self-Assembly of Peptide Nanostructures: Schematic representation of formation of nanoscale fibres by a process of enzymatically triggered self-assembly (image produced by Dr Robert Mart, University of Cardiff, UK).

The stiffness of these gels can mimic the stiffness of different tissues in the body,” says Professor Ulijn. “We’ve studied stem cell responses to these gels together with collaborators at the University of Glasgow, and found that they differentiate according to the stiffness. On a weaker material we observe elements of differentiation to nerve-like cells, because the weak material mimics the stiffness of brain tissue. While on the stiffer material, which mimics the stiffness of cartilage, we get differentiation towards more bone-like, cartilage-like cells.” A spin-off company, biogelX Ltd., has been established to capitalise on the commercial potential of these materials, while there are other potential applications in food and cosmetics. The project is also looking at the unusual electronic interactions that may occur with some of these gel-phase materials, which could lead to other applications. “We can design our building blocks in a way that they contain non-biological, synthetic ligands that are able to conduct electrons. Based on this concept we can actually make small conducting wires of these systems which may find uses in areas where biological systems need to interface with electronics,” says Professor Ulijn. “For example, if you want to interface electronics with
biology, to have a soft, adaptive interface that has some of the properties of the biological system and some of the electronic, man-made system, then our materials would be very useful to produce an effective interface between the two.” This research is very much interdisciplinary in scope. With the project’s work centring on these precisely defined nanostructures, Professor Ulijn says he and his colleagues need to combine many different techniques to access the relevant information. “It’s like a big jigsaw puzzle, where you use many different spectroscopic, materials characterisation and microscopy techniques, and increasingly also modelling,” he explains. “We would like to be able to rationally design and predict structures, which we’re still not very good at. We’re working with computational chemists and modellers to try to get to a better predictive ability. Our main challenge is that often we’re still not able to rationally predict what will happen. We hope to be in a situation where we would first go through a modelling phase, and try to decide whether that would be the best chance of success, and then start our experiments rather than the other way round. Currently we use modelling to explain what we’re doing – we would like to move on to a predictive, hypothesis-driven approach.”

Project Objectives
The proposed work aims to explore the concept of minimalistic biology – using simplified versions of biological molecules and processes to produce materials with unprecedented properties such as adaptation, motility and self-repair. The work thereby addresses challenges that are currently at the frontier of knowledge in the area of molecular self-assembly. In addition, we will work towards exploitation of our findings in the development of new materials for biomedical and energy applications.
Contact Details
Project Coordinator, Professor Rein Ulijn
WestCHEM Chair and Vice Dean Research (Science)
University of Strathclyde
T: +44 141 548 2110
E: rein.ulijn@strath.ac.uk
W: www.ulijnlab.com

Professor Rein Ulijn

Project Coordinator

Rein Ulijn currently runs an active research group of 25 researchers. He is WestCHEM Professor of Biomolecular Nanotechnology, Vice Dean Research for the Science Faculty and CSO of biogelX Ltd. He gained his MSc in Biotechnology from Wageningen University and PhD in Physical Chemistry from the University of Strathclyde.



Understanding chronic inflammation
The link between chronic inflammation and carcinogenesis is well known, but there is still much to learn about the underlying mechanisms. We spoke to Professor Mathias Heikenwälder about his research into the mediators that induce chronic inflammation, and how it then develops into liver cancer

The causal relationship between chronic inflammation and carcinogenesis is well established, but there is still much to learn about the underlying mechanisms behind the induction of inflammation and how it develops into cancer. This area forms the primary focus of the LiverCancerMech project’s research, which is funded by the ERC. “We aim to understand how chronic inflammation develops and establishes itself, the mediators that induce chronic inflammation and then how inflammation turns into liver cancer,” says Professor Mathias Heikenwälder, the project’s scientific coordinator. Chronic inflammation does not invariably lead to liver cancer but it is a high risk factor; Professor Heikenwälder and his colleagues are using mice as well as human tissue and patient data to study the condition. “We generated a mouse model using two cytokines that are highly up-regulated in patients chronically infected with Hepatitis B and C virus. We expressed them in the liver of a transgenic mouse to address the question of whether these cytokines are causally linked to the development of inflammation-induced cancer. What we generated were indeed mice that developed chronic inflammation,” he explains. “In somewhere between 20-30 per cent of the mice, this chronic inflammation then turned into cancer. So that was the starting point as an optimal model to investigate liver inflammation-induced carcinogenesis.”

This figure shows the histological analysis of a hepatocellular carcinoma in AlbLTαβ transgenic mice. (Left) Hematoxylin Eosin (H/E) staining to depict the liver architecture, displaying a tumour (round structure) in AlbLTαβ transgenic mice. (Middle) Collagen IV staining shows the widening of collagen IV fibres when compared to C57BL/6 controls, indicating the aberrant growth of hepatocytes. (Right) Ki-67 staining depicts the proliferation of hepatocytes that are not proliferating in wild-type mice.

Chronic inflammation
The initial cause of inflammation could be the presence of an antigen which the body’s immune system recognises as foreign, leading to persistent problems such as tissue damage, inflammation and compensatory proliferation in cases where the antigen persists for a long time. The body’s immune response to the antigen has the effect of damaging tissue, which leads to further inflammation. “This induces, again, more activation of immune cells, because immune cells will come to eat up – phagocytose – the damaged hepatocytes, they will get further activated and induce further compensatory proliferation – sustaining a vicious circle,” says Professor Heikenwälder. This compensatory proliferation is one of the keys to how inflammation induces cancer, but Professor Heikenwälder says it is not a case of simply following a chain of events. “If it was a particular sequence of events that you could easily follow on the molecular level, then you would see that every hepatocellular carcinoma (HCC) would have the same chromosomal aberrations. The normal end product of this chronic
inflammation would be cirrhosis, fibrosis, and then HCC, but there are exceptions,” he explains. “Not all HCCs are the same, we have many sub-types. One of the problems with this disease – which accounts for 800,000 deaths a year, the third highest number of cancer-related deaths worldwide – is that it is so heterogeneous, not only between patients, even in one and the same patient.” Indeed, some patients may have several different sub-types of HCC in their liver, which can be a major problem in terms of treatment. There are also different types of inflammation to consider. “There can be inflammation that is more aggressive, where you have particular polarised macrophages in the liver. Then there are inflammatory reactions where you have less macrophages but more lymphocytes,” outlines Professor Heikenwälder. Two of the major liver cancer risk factors are the
Hepatitis B and C viruses; new research has recently emerged in this area. “New data indicates that when you express the whole genome of a virus in the liver of a mouse, you can induce liver cancer with a very long latency – it takes two years – even in the absence of overt lymphocytic inflammation,” says Professor Heikenwälder. “This clearly indicates that the viral proteins have some oncogenic potential themselves. But it’s believed that the viral infection is very persistent. That matters in this particular case – the persistence of the virus infection drives an inflammatory reaction that then turns on the whole cascade.” Researchers can look at the early stages of this process through analysis of tissue from mice. It is thought that lymphocytes play an important role in driving the inflammatory process, while there is a signalling pathway in hepatocytes that drives liver cancer. “This indicates that
maybe the starting point of inflammation is actually the hepatocyte, that it somehow induces inflammation. Various techniques have shown that, in human patients infected with Hepatitis B and C viruses, it is actually the hepatocytes at the very beginning that start to express chemokines and cytokines. So these inflammatory molecules attract immune cells and that inflammation is actually needed to some extent for the virus to replicate efficiently,” says Professor Heikenwälder. Chronic inflammation also induces the killing of hepatocytes – and that in turn causes compensation, which leads on to further problems. “This causes fibrosis, it causes hepatocytes to proliferate more often than they actually want to,” continues Professor Heikenwälder. “Hepatocytes are usually in the dormant cell cycle state – called G0. They don’t like it if they are pushed into a cell cycle where they have to go through various other phases.”

These problems do not occur in acute inflammation, as the initial cause of the problem is resolved; the antigen is not there any more, the cytokines are not expressed and the hepatocytes can regenerate. However, with chronic inflammation, lymphoid tissue-like structures start to develop. “They present antigens, get activated, clonally expand and then leave again in the circulation to produce antibodies. These chronic inflammatory structures look almost identical, or certainly very similar, to what you can find in the secondary lymphoid tissue,” says Professor Heikenwälder. Chronic inflammation can also have an impact on other organs; Professor Heikenwälder points to the example of auto-immune pancreatitis. “This is a chronic inflammatory auto-immune reaction in the pancreas, it also forms another chronic inflammatory structure. As a consequence of this, high amounts of antibodies are secreted into the serum, and



At a glance


Full Project Title
The mechanisms of inflammation-driven liver cancer (Liver Cancer Mechanisms)



Project Objectives
The objective of the project is to understand how inflammation drives cancer. We try to recapitulate chronic hepatitis in mouse models (as in human patients infected with Hepatitis viruses) to investigate inflammation-induced liver cancer. We thereby plan to identify new molecules and cells involved in inflammation-driven cancer.
Project Funding
Funded by an ERC starting grant.
Contact Details
Project Coordinator, Professor Mathias Heikenwälder, Ph.D.
Institute of Virology, TU Munich
Schneckenburgerstr. 8, 81675 München
T: +49 89 4140-7440
E: heikenwaelder@helmholtz-muenchen.de
W: http://www.helmholtz.de/forschung/eu_projekte/ideen/erc_starting_grants/livercancermechanism
• Heikenwalder M, et al: Lymphoid follicle destruction and immunosuppression after repeated CpG oligodeoxynucleotide administration. Nat Med 10:187-92, 2004
• Haybaeck J, et al: A lymphotoxin-driven pathway to hepatocellular carcinoma. Cancer Cell 16:295-308, 2009
• Wolf MJ, et al: Endothelial CCR2 signaling induced by colon carcinoma cells enables extravasation via the JAK2-Stat5 and p38MAPK pathway. Cancer Cell 2012, July, in press. [Epub ahead of print]
• Seleznik GM, et al: Lymphotoxin β Receptor Signaling Promotes Development of Autoimmune Pancreatitis. Gastroenterology 2012 Aug 2 [Epub ahead of print]

Professor Mathias Heikenwälder Project Coordinator

Mathias Heikenwälder studied Microbiology and Genetics at the University of Vienna and the Institute of Molecular Pathology (IMP). After his master’s thesis in Berlin at the Max-Delbrück Center for Molecular Medicine (MDC) and his Ph.D. and postdoc at the University of Zürich, he became an independent group leader at the University Hospital Zürich (UZH). Since 2010 he has been Professor at the Department of Virology, Technische Universität München (TUM) and the Helmholtz-Zentrum München.


Macroscopic analysis of liver tumours in AlbLTαβ transgenic mice. (Left) Control C57BL/6 mice (18 months) do not develop liver cancer. (Middle) AlbLTαβ transgenic mice first develop liver tumours at 12 months of age. (Right) At later time points (18 months), whole liver lobes are affected, displaying large hepatocellular carcinomas.

these antibodies induce glomerulonephritis,” he explains. “They get deposited in the filtering system of the kidney and clog the kidney, and these patients also get an inflammatory disorder in the lung, most likely through cytokine signalling.” This reinforces the wider importance of research into chronic inflammation, which can be the cause of persistent tissue damage, which in turn is the basis of cancer development. The goal of improving treatment is made more complex by the ability of aberrant hepatocytes to evade the immune system. “In the liver, the hepatocytes and other
cells somehow have found a strategy to fool the immune system not to detect those aberrant cells as aberrant,” explains Professor Heikenwälder. Transferring T-cells able to recognise those aberrant cells into patients could improve treatment; another method is reducing chronic inflammation in the early stages, but Professor Heikenwälder says it is difficult to recognise it at this point. “That’s why one has to go from the other way round and ask; at what point can I block the inflammatory environment to prevent the development of cancer?” he says. “I’m not sure whether we will be able to treat an already established HCC with anti-inflammatory therapies, but it could be that with an anti-inflammatory therapy you could somehow make the HCC more accessible to other therapies. This is because inflammation can be protective for the tumour cells.”

We aim to understand how chronic inflammation develops, the mediators that induce chronic inflammation and then how inflammation turns into liver cancer

Treatment
The wider goal is to use this knowledge to improve treatment. The regimens the project is using are either in clinical trials, or already have been, for treatment of other inflammatory diseases. “One aim of our study is to see whether we can reproduce and define new biological principles in mice. The next step would be to apply for phase-1 clinical trials, where we look at safety,” says Professor Heikenwälder. However, currently there is no truly effective means of treating liver cancer; only palliative or inefficient treatment is available, an area Professor Heikenwälder plans to work on in future. “Can we understand what signalling events are important for the link between inflammation and cancer? And how do our models reflect the human situation?” he asks. “One of the big problems that we have at the moment in research into liver cancer is the fact that we don’t fully understand what human sub-types we reproduce in our mouse model systems. That is one of the reasons why, so far, not many drugs have been put on the market – since they were either tested in the wrong mouse models or the mouse models where particular drugs have been successful at first simply have not reflected any of the human situations, and therefore failed in clinical trials.”



Investigating the structure and regulation of chromatin

The SMINSULATOR project uses single molecule biophysics and quantitative modelling to investigate the structure and regulation of chromatin, the combination of DNA and proteins that makes up the chromosomes in the nucleus of a cell. The long-range interactions that occur between different parts of chromosomes are of particular interest, as Dr Marcelo Nollmann explains

Eukaryotic chromosomes are condensed into several hierarchical levels of complexity: DNA is wrapped around core histones to form nucleosomes, which form a higher-order structure called chromatin, and chromatin is subsequently organised by long-range contacts. The SMINSULATOR project, an EU-funded initiative based at INSERM in Montpellier, investigates the role of long-range interactions in nuclear organisation. “The main aim of our project is to investigate how long-range interactions in chromatin can lead to preferential organisation of the chromosome and transcription regulation,” outlines Dr
Marcelo Nollmann, the project’s coordinator. Transcription is the process by which the information encoded in the genome is converted into RNA, the template required to produce proteins. The specific transcription profile of a cell ultimately determines its identity. “Chromatin organises into euchromatin and heterochromatin, which display different physical and molecular characteristics that are key for the regulation of fundamental cellular processes, most notably of transcription,” explains Dr Nollmann. “Genes are transcriptionally silenced in heterochromatin (or ‘closed’ chromatin),
which is thought to possess a highly compact structure. Conversely, euchromatin (or ‘open’ chromatin) is defined as less condensed chromatin where most gene expression takes place during interphase. We are interested in identifying the protein network required to partition chromatin into the euchromatin and heterochromatin domains, and understanding how this partitioning leads to the definition of a cell’s identity through its transcription profile.” One of the major areas of investigation is insulator factors, a class of regulatory proteins which set up boundaries between heterochromatin and euchromatin.



Studying transcription regulation one molecule at a time
The project’s specific focus is on eukaryotic chromosomes, distinguished from prokaryotes by the presence of a nuclear compartment, within which genetic material is stored. Previously, it was thought that eukaryotic chromosomes were organised into chromatin and that prokaryotes were not, but recent evidence suggests that may not be the case. “Several bacterial proteins have been identified that may play a role similar to that of nucleosomes,” says Dr Nollmann. The problem is that relatively little is known about those factors and the way they work; Dr Nollmann and his colleagues are developing novel approaches, involving the detection of single molecules, to try to determine the roles of insulator proteins in chromatin organisation. “Mechanistically, we can see several parallels between eukaryotes and prokaryotes: they are both organised in chromatin-like structures that play a very important role in the expression of genes,” he says. Researchers are using two kinds of single molecule approaches to investigate these structures. The first involves a bottom-up approach to try to reconstruct the basic functions of insulator proteins in the cell. In this approach, each specific protein is purified in vitro and its activity is reconstituted. “This approach allows us to considerably simplify the system, a step usually key to asking highly mechanistic questions. Single-molecule studies, in addition, have the added advantage that they allow for the direct measurements of dynamics and sample heterogeneities otherwise obscured by ensemble averaging when using conventional methods. One example is the study of what specific proteins are required in the establishment of the long-range DNA-DNA interactions that we think happen in vivo,” explains Dr Nollmann. In the second approach, researchers
can directly test whether long-range interactions observed in the test tube also occur in the cell. “In this case, we develop methods that allow us to detect the position and movement of single protein molecules in live cells,” says Dr Nollmann. “These methods do not always provide such detailed insights, as the environment in the cell is much more complex and less controlled, but they are still key as they allow us to directly test the possible differences of protein actions when they are in their real cellular environment.” Dr Nollmann says insulators play a vitally important function. “You need to separate the chromatin regions that are heavily expressed in some cell types and not in others, because that essentially is going to give the identity of the cell, and the outlook of proteins that are expressed in that cell,” he stresses. Several mechanisms are required to enforce the boundary between euchromatin and heterochromatin and change it when needed, such as if certain cell types are developing. “They may need to change the proteins that are expressed, so they need to go in to some regions of the chromatin and change it, for example from heterochromatin to euchromatin. So they need access to those boundaries, and the proteins that participate in them,” explains Dr Nollmann. “In that case it may generate a spread of euchromatin into heterochromatin regions, which may lead to the expression of genes in those heterochromatic regions.”

Speculative network of factors leading to long-range DNA interactions between insulators; the diagram depicts an enhancer, a promoter, RNAPII, remodelling factors, cohesin, NELF, nucleosomes and insulator binding proteins (IBP).

Quantitative models
Genome-wide methodologies are also very powerful as they allow researchers to measure genomic binding sites of proteins involved in chromatin organisation, and to detect the existence of long-range interactions. These methods are limited, however, in that they provide information about a large, mixed population of cells that may have different binding profiles or chromatin conformations and interactions of/between different proteins.

Single-molecule super-resolution microscopy is a novel methodology that is used to obtain images of the structure of chromatin at the nano-scale.
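As an editorial aside, the core localisation step behind super-resolution approaches of this kind can be sketched in a few lines: each blinking fluorophore appears as a diffraction-limited spot, and its centre is estimated with sub-pixel precision by fitting a simple model to that spot. The Python below is a generic, simplified illustration only – the symmetric Gaussian spot model, the invented helper name localise_spot and the assumed 100 nm pixel size are choices made for this example, not code or parameters from the SMINSULATOR project.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian2d(coords, x0, y0, sigma, amplitude, offset):
    """Symmetric 2D Gaussian model of a single diffraction-limited spot."""
    x, y = coords
    return (amplitude * np.exp(-((x - x0)**2 + (y - y0)**2) / (2 * sigma**2)) + offset).ravel()

def localise_spot(roi, pixel_size_nm=100.0):
    """Estimate the sub-pixel centre of one bright spot in a small image region (ROI)."""
    ny, nx = roi.shape
    y, x = np.mgrid[0:ny, 0:nx]
    guess = (nx / 2, ny / 2, 1.5, roi.max() - roi.min(), roi.min())
    params, _ = curve_fit(gaussian2d, (x, y), roi.ravel(), p0=guess)
    x0, y0 = params[0], params[1]
    return x0 * pixel_size_nm, y0 * pixel_size_nm  # position in nanometres

# Toy example: an 11x11 pixel spot with noise, centred at (5.3, 4.7) pixels.
rng = np.random.default_rng(0)
yy, xx = np.mgrid[0:11, 0:11]
spot = 200 * np.exp(-((xx - 5.3)**2 + (yy - 4.7)**2) / (2 * 1.5**2)) + 10
spot += rng.normal(0, 2, spot.shape)
print(localise_spot(spot))  # close to (530 nm, 470 nm) for a 100 nm pixel
```

Repeating such fits over many thousands of detected spots, frame after frame, is what gradually builds up a super-resolved map of where the labelled molecules sit.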



At a glance
Full Project Title
Unveiling the Roles of Chromatin Insulators in Higher-order Chromatin Architecture and Transcription Regulation one molecule at a time.

By contrast, single-molecule microscopy gives the unprecedented ability to discover the function of different proteins in chromosome organisation in each individual cell, and to investigate how chromosome organisation is modulated in time and in response to external cues. The production of quantitative models of chromatin structure and regulation from single-molecule data is an essential part of the project’s work, so that researchers can draw comparisons with results obtained from genome-wide approaches. “These comparisons will allow us to determine whether the chromosome is organised and regulated in the same way in different cells and study the role of heterogeneity in cell populations. Also, we will be able to determine the dynamic changes in chromosome architecture and transcription regulation during the life of a cell,” explains Dr Nollmann. The higher-order structure of chromatin is highly complex and determines the transcription profile of a cell, so the idea that different cells or cells in different stages of their life cycle would have different higher-order chromatin structure would not be entirely surprising.

Single-molecule microscopy gives the unprecedented ability to discover the function of different proteins in chromosome organisation in each individual cell, and to investigate how chromosome organisation is modulated in time and in response to external cues

This research is mainly fundamental in nature, but Dr Nollmann nevertheless expects that their results will have an impact on the study of disease. The techniques developed in the project are currently being applied to look at very small, simple systems to gain rigorous mechanistic insights, but that could be extended to the study of chromatin architecture and the role of insulator factors in entire animals to gain a wider perspective. “We can do experiments under those conditions and see what is different in the embryos with respect to single cells. In future, we could also look at mature flies and, for example, the expression of those genes in different tissues, to determine how insulator proteins orchestrate the organisation of chromatin and the transcription program leading to cell differentiation. So I hope that by the end of the project we will get a full picture of how these very fundamental mechanisms affect gene expression patterns in the whole organism,” says Dr Nollmann. With the project still in its early stages, Dr Nollmann says the methodology could evolve. “I hope that in future we can move more and more towards using these single molecule methods in tissues, even maybe live animals,” he says.

Project Objectives
Our research aims to unravel the mechanism by which insulator bodies dynamically regulate chromatin structure and transcription by using single-molecule biophysics and quantitative modeling. This project has the potential to impact our understanding of several fundamental cellular processes, including transcription regulation, cell-cycle dynamics, higher-order chromatin organisation, and cell differentiation.
Contact Details
Project Coordinator, Dr Marcelo Nollmann
Centre de Biochimie Structurale
29, rue de Navacelles
34090 Montpellier
France
T: +33 (0)467 417 912
E: marcnol@gmail.com
E: marcelo.nollmann@cbs.cnrs.fr
W: https://sites.google.com/site/smbmontpellier2/insulator-biology
http://www.ncbi.nlm.nih.gov.gate1.inist.fr/pubmed/21983085

Dr Marcelo Nollmann

Project Coordinator

Magnetic tweezers are a single-molecule manipulation technique that uniquely reports on dynamic changes in DNA extension caused by the formation of long-range DNA interactions upon the binding of proteins.
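To make the idea concrete, the short sketch below shows one simple way a loop-formation event might be read out of a (simulated) extension-versus-time trace: the trace is smoothed and samples where the tether has shortened by more than a chosen amount are flagged. This is a generic illustration under stated assumptions – the function name, window length and threshold are invented for the example and do not describe the project’s actual magnetic tweezers analysis.

```python
import numpy as np

def detect_looping_events(extension_nm, window=51, drop_threshold_nm=150.0):
    """Flag samples where the smoothed DNA extension drops well below its starting level,
    a simple proxy for protein-mediated loop formation in a tweezers trace."""
    # Rolling-median smoothing to suppress Brownian noise on the bead position.
    pad = window // 2
    padded = np.pad(extension_nm, pad, mode="edge")
    smoothed = np.array([np.median(padded[i:i + window]) for i in range(len(extension_nm))])
    baseline = np.median(smoothed[:window])  # assume the trace starts in the unlooped state
    return smoothed < (baseline - drop_threshold_nm)

# Toy trace: ~1 um of tether that loses ~300 nm of extension halfway through.
rng = np.random.default_rng(1)
trace = np.concatenate([np.full(500, 1000.0), np.full(500, 700.0)]) + rng.normal(0, 20, 1000)
looped = detect_looping_events(trace)
print(looped[:5], looped[-5:])  # unlooped at the start, looped at the end
```

In real data, drift correction and more careful step detection would be needed; the point is simply that loop formation shows up as a measurable drop in DNA extension.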

Dr Marcelo Nollmann is a Principal Investigator at the Centre de Biochimie Structurale (CBS) in Montpellier, where he started his own laboratory in May 2008. Its main axis of research is the development and application of single-molecule methods to study the mechanisms by which DNA is moved and organised in the cell. Researchers have so far built a magnetic tweezers setup, and a one-color photoactivated localisation fluorescence microscope (PALM).



Cardiovascular diseases and related conditions are the number one cause of death worldwide. Using polymer materials to treat cardiovascular conditions less invasively than current procedures could boost both patients’ life expectancy and quality of life, as Project Coordinator Frank Stam and Professor Paul Herijgers of the Heart-e-Gel project explain

Using smart hydrogels for treating cardiovascular diseases

Cardiovascular diseases and related conditions are the number one cause of death worldwide, particularly in older people, who are often unable to survive or undergo the invasive high-impact procedures required. If researchers can develop new, less invasive methods of treating cardiovascular conditions then this could boost both patients’ life expectancy and their quality of life, according to Professor Paul Herijgers, a cardiovascular surgeon working within the Heart-e-Gel project (www.heartegel.eu). The ambitious aim of the project is to develop new, minimally-invasive systems for cardiovascular applications, based on smart stimuli-responsive hydrogels. To realize this, a multi- and interdisciplinary approach is required. Hydrogels are not new in biomedical applications, as they can already be found in global applications such as soft contact lenses and drug delivery systems. The hydrogels developed within the project should possess a swelling and shrinking behaviour that can be controlled with an electrical charge. The concept is to introduce a small hydrogel sample into a blood vessel via a catheter-type delivery device and manoeuvre it to the intended treatment location. At its end destination the gel can then be allowed to swell until it fully occludes the vessel, or alternatively to fulfil another function.

The delivery system can electrically control the gel, while it is introduced into the blood vessel and manoeuvred to the intended treatment location. At its end destination the gel can then be allowed to swell until fully occluding the vessel


Electroactive hydrogel
The Heart-e-Gel consortium brings together a range of different partners from both the academic and commercial sectors. A key goal is to get the gel functioning as cardiovascular surgeons require; Project Coordinator Frank Stam says there are some significant technical challenges in this work. “For instance the hydrogel will not only need to be biocompatible in blood, but it also needs to be delivered and held in place to swell in dynamic flow conditions. Also very challenging is sustaining its swelling and position under the long-term pulsating blood pressure,” he outlines. “Since there was no ready-to-use hydrogel material available, part of the project work focuses on this aspect. In



The Heart-e-Gel consortium brings together a range of different partners from both the academic and commercial sectors. A key goal is to get the gel functioning as cardiovascular surgeons require

Tip of minimal invasive hydrogel delivery device

this respect, developing a gel which elegantly combines a quick enough swelling (minutes) with sufficient mechanical strength and fatigue resistance is one of the main material development challenges,” says Professor Peter Dubruel, who is leading the material development. “For electro-activating the gel, electrodes will be used, creating an electric field within the gel affecting the ion flow and its swelling capacity. At the moment pulsed electro-activation is investigated to shrink the gel, to be followed by passive swelling,” points out Professor Herijgers. Various electrode designs and configurations specific to the intended applications of the gel in cardiovascular surgery are currently being tested. The project team strongly believes that precise control of the swelling and shrinking will lead to significant improvements in treatment, making minimal invasive surgery available to a wider range of patients. While researchers have not yet been able to achieve the optimum material, Stam says they have obtained an understanding of how to tailor the behaviour of the gel. “We have undertaken extensive experimentation with many material formulations and found that the physical properties of the




At a glance
Full Project Title
Microsystem integration based on electroactive polymer gels for cardiovascular applications (HEART-E-GEL)
Project Objectives
The main objective is addressing the challenges associated with developing novel microsystems technology platforms to enable utilisation of smart hydrogels for minimal invasive treatment of cardiovascular diseases.
Project Funding
€2,750,000. The project is funded by the European Union Seventh Framework Programme FP7/2007-2013 under grant agreement n° 258909
Project Participants
• Professor Peter Dubruel, Polymer Chemistry and Biomaterials Group, UGent, Belgium
• Dr Eduardo Mendes, Materials Characterisation, TU Delft, The Netherlands
• Professor Yosi Shacham & Professor Viacheslav Krylov, Faculty of Engineering, Tel-Aviv University, Israel
• Professor Paul Herijgers, Experimental Cardiac Surgery, KU Leuven, Belgium
• Dr Renzo Dalmolin, Industrial medical devices, Sorin CRM, France
• Dr Hercules Pereira Neves & Dr Dieter Cuypers, Biomedical Microsystems, IMEC, Belgium
Contact Details
Project Coordinator, Frank Stam
Tyndall National Institute
Lee Maltings, Dyke Parade, Cork, Ireland
T: +353 21 4904110
E: frank.stam@tyndall.ie
W: www.heartegel.eu

gel change with the amount of swelling. Depending on how the swelling is obtained, its mechanical behaviour could be adversely affected and the gel could become brittle,” Stam explains. To mimic application-like conditions, artificial circulatory flow systems are used to investigate the behaviour of the gels. For instance, the effect of different hydrogel sample dimensions and surface-to-volume ratios could be investigated in relation to withstanding the pulsating blood pressure that would otherwise dislodge the sample. “We are looking for answers on what expansions to use and what forces can be exerted on the vessel walls,” says Professor Herijgers.

Treating cardiovascular conditions
There is a range of potential medical applications for the gel, but currently the project is focused on using it for occluding arteries. The system could also be used to fill any unwanted spaces within the cardiovascular system, which can occur in people suffering from aortic aneurysms. “The aorta is the largest blood vessel in the body. This can distend on certain occasions because of ageing, atherosclerosis or congenital abnormalities. And when it dilates there’s a risk of rupture,” explains

Professor Herijgers. It is sometimes possible to place a covered stent inside these dilatations so that they are excluded from the rest of the blood stream, but unfortunately treatment often fails because of leakage around the stents. “One idea is that you could fill this external part. The electro-activated gel would form around this covered stent, so that you completely fill this space and close the leak,” says Professor Herijgers. The current focus of the Heart-e-Gel project is on developing the overall system and ensuring it is physically and chemically stable, compatible with living tissue and capable of operating in a blood environment. The hydrogel development has reached an advanced stage, so that the various system elements can now be put together. “We will be assembling the implantable gels in a prototype delivery device containing electrodes which can be controlled from the outside,” outlines Professor Herijgers. Once the hydrogel achieves the pre-clinical ‘vessel occlusion’ specifications and the concept is proven, the wider Heart-e-Gel technology platform can be utilised, adapted and custom designed for other valuable cardio-vascular applications and diversified markets.

Diagrams illustrating delivery and deployment of a gel device resulting in stemming of blood flow.

Frank Stam

Project Coordinator

Frank Stam received his MEngSc (Mechanical Engineering) in 1989 from the University of Twente, The Netherlands. He then went to Digital Equipment Corporation (DEC), Galway, Ireland from 1989 until 1992, where he developed microelectronics interconnection processes for mainframe computers. Subsequently he joined the Tyndall National Institute in Cork, Ireland, where he became a senior research scientist in 1996 and set up a Biomedical Microsystems team in 2001 with an interest in implantable microelectronics.




We’re stronger together, says Chair of research programme

The goal of finding a cure for neurodegenerative diseases is one of the ‘grand challenges’ facing European society. The scale of the challenge is beyond the scope of any single European country and demands a collaborative approach to research, says Professor Philippe Amouyel of the EU Joint Programme on Neurodegenerative Disease Research (JPND)




There is currently no cure for debilitating neurodegenerative diseases such as Alzheimer’s, Huntington’s and Parkinson’s disease. With the number of cases projected to rise further as life expectancy increases, research into neurodegenerative diseases is a major priority. The scale of the challenge demands a collaborative approach to research, according to Professor Philippe Amouyel, Chair of the JPND Joint Programming Initiative. “The primary aim of JPND is to fight against the fragmentation of research efforts in Europe,” he explains. Research projects investigating neurodegenerative disease are currently ongoing within several European countries, but Professor Amouyel believes it is important to look beyond national borders towards wider collaboration. “The means to effectively fight these diseases will probably only come from a big continent like Europe, rather than from any one country. Targeting funding towards excellent, collaborative research, not only taking place in a single country but all over Europe, should really improve the efficiency of projects,” he says. The JPND programme is the pilot of a new collaborative ‘joint programming’ approach to research in Europe, bringing countries together in pursuit of shared goals that are beyond the scope of any single nation. This approach is designed to help address the ‘grand challenges’ facing European society, one of which is finding cures for neurodegenerative diseases. “The primary goal of joint programming is to allow a synergistic use of shrinking research budgets in the current difficult economic climate,” outlines Professor Amouyel. While finding cures for neurodegenerative diseases is JPND’s ultimate goal, there are also several other strands to their research agenda. “We include basic research, clinical research, and also social and healthcare research,” says Professor Amouyel. Basic and clinical research into the underlying mechanisms of neurodegenerative diseases is of course crucial to finding cures, but the programme is also pursuing research into healthcare and social care. “You have to help people to live with the disease until you can develop a treatment,” stresses Professor Amouyel. “We are encouraging research of the same quality in these different fields.”

Neurodegenerative disease
This is reflected in the structure of the programme’s Scientific Advisory Board, which includes not only basic researchers, but also clinicians and social and healthcare researchers. Together with representatives from industry and patient associations, the board launched a strategic research ‘roadmap’ in February this year, which outlines the research priorities to be addressed over the next ten years in Europe. “It’s a living document,” stresses Professor Amouyel. “As science continually evolves, we need to keep it updated.”

The roadmap focuses on several neurodegenerative diseases including Alzheimer’s and related disorders, motor neurone disease, prion disease, Parkinson’s disease and related disorders, Huntington’s disease, cerebral ataxia and spinal muscular atrophy. Some conditions are excluded – for example multiple sclerosis, and diseases which are only neurodegenerative at the end of their progression. “JPND really focuses on neurodegeneration itself; we aim to understand why neurons develop problems,” explains Professor Amouyel. A disease which already affects 20 per cent of people over the age of 85, Alzheimer’s provides a useful example of a neurodegenerative disease which JPND is tackling. In the first description of Alzheimer’s in 1906, two types of abnormal brain lesions were found: senile plaques and neurofibrillary tangles, both of which are part of the current diagnosis of the disease. Researchers have since found that the senile plaques are comprised mainly of the amyloid protein, whereas the neurofibrillary tangles are comprised of the tau protein. However, researchers don’t yet know which type or level of amyloid and tau triggers the development of Alzheimer’s. “Indeed, we have observed some people with a lot of plaques in their brain who don’t have any major problems,” explains Professor Amouyel.

One of the greatest unmet needs in this area is a lack of knowledge regarding the factors that determine why some people are more likely to develop neurodegenerative diseases than others


Improving Diagnosis and Risk Factors
The accumulation of amyloid in the brain begins well before the first symptoms of Alzheimer’s manifest themselves, so the disease is usually well-established by the time a patient is diagnosed, making it even more difficult to treat. Improving diagnosis of neurodegenerative diseases is thus one of the major areas JPND is keen to address. “How can we obtain earlier diagnosis? How can we test for a disease which hasn’t yet expressed symptoms?” asks Professor Amouyel. “To do this, we need greater understanding of what is behind the development of the disease.”

One of the greatest unmet needs in this area is a lack of knowledge regarding the factors that determine why some people are more likely than others to develop neurodegenerative diseases. Therefore, an immediate need is to uncover the genetic and environmental risk and protective factors which are associated with neurodegenerative diseases. Although family history and genetic background can be important factors to determine risk, so too can environmental and behavioural factors, which may also determine protection from, or resilience to, neurodegenerative diseases. “The relationship between genetic, epigenetic and environmental risk factors and their relative importance will need to be established so that those factors that can be modified to delay or prevent disease can be identified,” Professor Amouyel says. Indeed, JPND has recently agreed to launch a call for research proposals in this area.






Targeting funding towards excellent, collaborative research, not only taking place in a single country but all over Europe, should really improve the efficiency of projects

Treatments
Once an individual has been identified as being susceptible or a possible early case, ‘biomarkers’ can be used to track the development of the disease. “We can now measure biomarkers in the brain and cerebrospinal fluid during the lifetime of a patient – using MRI and PET scanners – to both identify early-stage cases and also to track the disease,” explains Professor Amouyel. The JPND programme launched an action in 2011 in precisely this area, looking at the standardisation and harmonisation of biomarkers across Europe.


Identifying people who are susceptible to neurodegenerative diseases is an important area for research, but needs to be combined with improvements in treatments – i.e. identifying targets for new and improved drugs to treat diseases. Researchers have already identified many similarities and parallels which relate neurodegenerative diseases to one another on a sub-cellular level. Discovering more of these similarities offers hope for the development of treatments that could ameliorate many diseases simultaneously. However, drug development is typically a long drawn-out process and very few new drugs have reached the marketplace over recent years. For example, two major pharmaceutical companies, Johnson & Johnson and Pfizer, recently halted Phase 3 clinical trials of a new Alzheimer’s drug called bapineuzumab, which targets and reduces concentrations of beta amyloid in the brain.



“The hypothesis seems good, the treatment seems efficient, but it doesn’t seem to have any effect on cognitive function. It seems that it is not enough just to wipe out amyloid from the brain – if a drug doesn’t lead to improvements in cognitive function, then it does not proceed to market,” says Professor Amouyel. Some researchers have suggested that maybe the treatments were being delivered too late to have any real impact on cognitive function. The first symptoms of Alzheimer’s can appear as late as fifteen years after the beginning of the disease. It may be that in this period the brain is irretrievably damaged. “The idea is that maybe we could test these treatments earlier, before the first symptoms appear,” explains Professor Amouyel. One potential new approach to Alzheimer’s treatment could be to develop a vaccine – where the body uses its immune system to fight against infectious disease. “When a microbe or virus enters your body it is immediately recognised and you develop antibodies that eliminate it. The drugs being developed by Johnson & Johnson and Pfizer were based on passive immunisation, where ready-made antibodies are introduced into the body. Instead, a vaccine could be based on active immunisation, where a form of amyloid is injected into an individual, and their body will develop antibodies which will travel to the brain and try to eliminate amyloid,” outlines Professor Amouyel. New treatments or vaccines remain a long way off, however, and in the meantime patients and their families still have to live with the effects of the disease, an issue which is also being addressed in JPND. “We have social and healthcare researchers looking into how to help not only the patients, but also their families, have a better quality of life,” says Professor Amouyel. “Caring for somebody with a neurodegenerative disease can be totally exhausting. The best place for early-stage patients to preserve a high quality of life is at home. However, many people still need to be cared for day and night, and even for the most devoted husband, wife or daughter,

this can take its toll. This is not about separating or institutionalising people with neurodegenerative diseases, it’s about helping both patients and carers have a better life.” A second JPND call for proposals, to be launched later this year, will look at assessing and comparing the policies, strategies and interventions related to caring for people with neurodegenerative disease, with regard to quality, access and cost-effectiveness. Examples of areas to be evaluated in this call include care pathways, psychosocial interventions, end-of-life strategies, and educational programmes that benefit not just people with neurodegenerative diseases, but also their carers and families. With the JPND roadmap published and momentum gathering, the programme is now implementing its priorities through a range of large-scale initiatives. The first phase of implementation has already begun through an annual series of actions to be launched over the next three years. The two JPND Calls later this year are the first step in this process. “JPND is not just about calls for proposals,” explains Professor Amouyel. “Priority areas such as Animal Models and Assisted Living Technologies are also being assessed through specific Action Groups. In addition, JPND is investigating partnerships with industry and other international research initiatives to implement its research strategy, and is encouraging more structured, dedicated national action plans from each participating country.” “With the best scientists all over Europe working together, we can tackle all of these major diseases at the same time to bring about real and immediate improvements for patients and their families.” More information on JPND is available on the JPND website –

www.jpnd.eu

Professor Philippe Amouyel of the EU Joint Programme on Neurodegenerative Disease Research (JPND)




Professor André Nollkaemper, founder of the Amsterdam Center for International Law (ACIL), Dr Hege Elisabeth Kjos, Assistant Professor at the University of Amsterdam, and Drs Martine van Trigt, Management Assistant at the ACIL, discuss their research on International Law through the National Prism: the Impact of Judicial Dialogue; a collaborative research project whose principal institution is the ACIL.

Opening Dialogues Between Domestic and International Law
The question of whether domestic courts should refer to case law from an international platform has sparked a great deal of debate within the legal community worldwide. The overall aim of this project is to investigate and provide systematic analysis of the extent and effect of transnational judicial dialogues on international law. Professor André Nollkaemper, the project leader, elaborates on this: “International law is a body of laws applying to multiple states at the same time. These laws eventually have to be interpreted and applied at a domestic level. During that process, national courts, government ministries, and legislators arrive at their own national interpretation and application of a rule of international law. What happens more often than not is that different states interpret and apply international law in different ways. This is where judicial dialogue comes in. It appears as if courts more frequently look towards what other courts in other states have done


when it comes to the way they interpret a particular rule of international law.” The ACIL team offers some instances of domestic court decisions where the judges referred to a decision made by foreign or international courts. Cases such as Lawrence v. Texas and Roper v. Simmons draw upon international law when considering consensual homosexual conduct and the execution of juvenile murderers respectively. “Yet in both cases there were strong dissenting voices,” Dr Hege Elisabeth Kjos tells us. Within the US Supreme Court there are those who view foreign law as “persuasive authority” in certain cases, whereas opposition calls for foreign laws to be “categorically barred from constitutional adjudication”. In Lozano v. Italy the immunity of a US soldier for acts committed in Baghdad, Iraq was upheld after the Italian Court of Cassation conducted a comprehensive analysis of the jurisprudence of both national and international courts. These are just a few examples of such instances that form the backbone of the research; the project will gather a broad spectrum of examples on which to base its analysis.

What happens more often than not is that different states interpret and apply international law in different ways. This is where judicial dialogue comes in

The project will study court methodology and motivation in engaging in these dialogues as well as identifying ‘best practices’ to ascertain the conditions under which a domestic court should refer to foreign interpretations of international law. Professor Nollkaemper explains: “We are approaching this academically, but I think



it’s fair to say that many international lawyers, myself included, think that in principle judicial dialogues can be beneficial given the interdependence between states and the fact that international law applies across states. In time we hope to assess not only the benefits, but also the risks and make courts more aware of what they are doing.” Judicial dialogue on international law issues raises the additional question of what effect it may have on the international legal order further down the line. “For this purpose,” says Dr Kjos, “the project also examines the role domestic court cases play in decisions by international courts and tribunals.” The project builds upon another study which has been carried out at the University of Amsterdam for the last ten years, International Law in Domestic Courts; this project is essentially a database compiling case reports on a global level. The project has access to worldwide national case law and Professor Nollkaemper is confident that the access they have will allow the research team to address the project issues with a reasonable degree of certainty. The advantages of utilising the existing database are that it gives the team access

to plenty of materials and also provides a pre-existing framework into which to add further cases. Another clear benefit is the project’s collaborative nature. The ACIL cooperates with four other European universities in this European Collaborative Research Project (ECRP) in the Social Sciences funded by the European Science Foundation (ESF). In separate, yet complementary ways, each university contributes to the overall goal of the project with a view to more fully understanding the various aspects and effects of judicial dialogue. At the project’s conclusion, the gathered information will be disseminated in a number of ways as Dr Kjos explains: “There will be articles published on various topics and we have already had a conference in Vienna on the law surrounding immunities. There will also be a mid-term conference in Oslo in the summer of 2013, followed by a final conference in Amsterdam; the result of these conferences will be a collection of edited volumes containing conference papers and proceedings.” In addition there will be an outreach scheme involving judges on an international scale; it aims to raise awareness with judges and legal practitioners and further involve them in the process.

At a glance Full Project Title International Law through the National Prism: the Impact of Judicial Dialogue Project Objectives Judicial dialogues on international law issues allow domestic courts to arrive at better responses to shared problems. This international collaborative research project identifies ‘best practices’ to ensure that court-to-court dialogues are conducted in accordance with sound criteria, strengthening the role of courts as guardians of the rule of law. Project Funding The project is funded by the European Science Foundation (ESF) as a European Collaborative Research Project (ECRP) in the Social Sciences. Contact Details Drs Martine van Trigt, Amsterdam Center for International Law Faculty of Law University of Amsterdam P.O. Box 1030 NL-1000 BA Amsterdam T: +31 (0) 20 525 2961 E: acil-fdr@uva.nl W: www.acil.uva.nl

Professor André Nollkaemper

Project Leader

Professor P.A. (André) Nollkaemper is Professor of Public International Law and Vice-Dean for Research at the Faculty of Law, University of Amsterdam. He is also (external) Advisor to the Ministry of Foreign Affairs of the Netherlands and Member of the Board of the European Society of International Law.



A new look at Latin American nationhood

Wars of independence and post-independence conflicts flared across Latin America in the early 19th century, eventually leading to the overthrow of the former colonial masters and the formation of new nations. While the new nations had their roots in the same cultural heritage, they developed very differently, as Juan Carlos Garavaglia of the Statebglatamerica project explains. The early 19th century was a tumultuous period in Latin American history as wars flared throughout the region, eventually resulting in the overthrow of the former colonial masters and the formation of new nations. Based at the Universitat Pompeu Fabra in Barcelona, Professor Juan Carlos Garavaglia is studying the period. “The


Statebglatamerica project’s principal objective is to study the process of state-building and the invention of the nation in Hispanic America,” he outlines. This process had its roots in a shared language, religion and legal culture, demonstrating that separate nations can be formed from a common history, but the beginnings of the Latin

American wars of independence were at least partly inspired by external events. “Some of the intellectuals belonging to the white elite that led this process of statebuilding had read several of the most widely disseminated authors that shaped both the North American and French revolutions, such as Rousseau, Voltaire, Jefferson, and



Spanish political philosophers such as Vitoria, Mariana and Suárez,” says Professor Garavaglia. “But the Hispanic American revolutions for independence broke out, above all, as a consequence of the dynastic rupture that occurred in Spain when Napoleon intervened in the Iberian Peninsula. This resulted in a royal vacancy that produced a breach in legitimacy and cleared the way for the formation of Juntas in Spain and America.” The royal vacancy sparked demands for sovereignty to be ‘returned’ to the people, with unrest eventually erupting into war throughout Spain’s American colonies. The Statebglatamerica project is focusing in particular on seven nations – Argentina, Chile, Colombia, Costa Rica, Ecuador, Guatemala, and Uruguay – and two Argentinian provinces, Buenos Aires and Santa Fe. “The period under consideration comprises the first sixty years after independence, roughly 1810 to 1870 – namely, from the Wars of Independence until the War of Paraguay (1865-1870),” outlines Professor Garavaglia. The newly independent nations were forged out of the embers of conflict, so the military held a powerful position (with the exception of Chile and Costa Rica), while the conflicts also caused social disruption and affected labour relations. “The need for men to fight meant that in some areas, including Mexico, Venezuela and Rio de la Plata, labour relations bore a different mark to those of the colonial period,” explains Professor Garavaglia. “War persisted even after the final defeat of the royal armies on the continent in the battle of Ayacucho in 1824 – Cuba and Puerto Rico remained colonies. In several areas such as the Rio de la Plata, Colombia, Venezuela and Mexico, wars were an everyday occurrence in the lives of the urban populace and the peasants until the end of the period we are analysing.”

Ecuador, Archivo Histórico del Ministerio de Cultura. Reserva Militar, Quito, 1910

Governance and administration

Following independence the new nations of Latin America developed systems of governance and administration. With the exception of Brazil, the new nations were organised as republics, which was a major shift from the previous monarchical system. “Symbolic reconstruction had to be carried out in Hispanic America, in order to transform loyalty to a king as a subject into loyalty to a republic as a citizen,” explains Professor Garavaglia. This process continued throughout the period studied by the project, alongside the development of representative government. “A republican system presupposes the creation of forms of representing citizens and in the case of Hispanic America, very diverse electoral systems were rapidly created,” says Professor Garavaglia. “In some cases, such as Buenos Aires from 1821 on, they had far-reaching implications, involving an extensive body of male voters. But the electoral systems varied – the majority were based on censitary suffrage, which means that only a reduced minority of adult men had the status of citizen with the right to vote, as was the case in Spain, France, and England during the same period. Another characteristic that accompanied this republican process was the drawing up of constitutions, following the example of

the three written constitutions that existed at the time, the American (1776), the French (1791), and the Spanish, which was drawn up in Cadiz in 1812.” The continuation of conflict throughout the period being studied by the project also demanded an efficient bureaucracy to find and allocate resources. State finance is a central issue for any nation, particularly in a time of ongoing conflict, and Professor Garavaglia’s group has studied Treasury and Department of War reports from each of the selected countries. “Most of the financial resources came from foreign trade – especially

Argentina, Archivo Julio Marc. Proclama militar, 1859

Ecuador, Archivo Histórico del Ministerio de Cultura. Gabriel García Moreno y Monseñor Checa, Quito, 1880



At a glance Full Project Title A Comparative Study of the State Building process in Latin America (1820-1870) Team Members Claudia Contente and Evangelina de los Ríos, Argentina. Mario Etchechury, Uruguay. Elvira López Taverne, Chile. Pilar López Bejarano, Colombia. Pablo Rodríguez Solano, Costa Rica. Juan Carlos Sarazúa, Guatemala. Viviana Velasco Herrera, Ecuador. Associated Members: Juan Pro Ruiz and Enric Ucelay da Cal, Spain. Rodolfo González Lebrero and Alejandro Rabinovich, Argentina. Pierre Gautreu, France. Project Funding Advanced Grant N° 23246, ERC European Research Council, 2008-2013 Project Partners UPF, Barcelona and Institució Catalana de Recerca i Estudis Avançats, ICREA, Generalitat de Catalunya Contact Details Principal Investigator, Professor Juan Carlos Garavaglia Universitat Pompeu Fabra, Barcelona T: 00 34 93 542 1421 E: jcgara@hotmail.com W: www.statebglat.upf.edu

Professor Juan Carlos Garavaglia

Project Coordinator

Juan Carlos Garavaglia is ICREA research professor at the Universitat Pompeu Fabra, Barcelona, and directeur d’études at the Ecole des Hautes Etudes en Sciences Sociales, Paris. He has published several books on the agrarian and economic history of Latin America and is currently working on state-building in nineteenth-century Latin America.


from the taxes on imported merchandise for mass consumption – and from a series of internal taxes that were inherited from the colonial period, such as monopolies on salt, tobacco and liquor, as well as tithes and tributes paid by Native Americans. That is to say that one way or another, it was the masses that bore the costs of financing the State,” he outlines. The newly forming states had to deal with the issue of taxation; in some places, such as Río de la Plata, the economy was reoriented towards the Atlantic and the tax system became dependent on foreign trade. “More than 85 per cent of taxation in the two Río de la Plata states depended almost exclusively on foreign trade and especially on the level of imports, to the point where taxes fell mostly on imports and not exports. This meant they were really paid by consumers when they acquired imported merchandise,” explains Professor Garavaglia. It is clear that this manner of financing the State through taxes on imports staved off disputes about the legitimacy of taxation and therefore the very legitimacy of the State for a time. The empire of Brazil and Venezuela generated a similarly high proportion of their revenue from foreign trade, approximately 70 per cent, while in Chile

and Peru this source of income accounted for at least half the state budget. However, in other nations, including Bolivia, Peru and Ecuador, the indigenous population paid a substantial part of the budget through various forms of taxation, showing the extent to which the legacy of the colonial tax system remained present in many Ibero-American states in the mid-19th century. “In many countries, such as Costa Rica, Guatemala and El Salvador, as well as Ecuador and Colombia, government-licensed monopolies continued to be one of the pillars of tax collection until the end of the 19th century,” says Professor Garavaglia. Economic development is an important element of state-building, and international trade became crucial to the new nations of South America. “For Costa Rica, Guatemala, Ecuador, Venezuela, Brazil, Argentina and Uruguay, agricultural and livestock products were fundamental: cacao, sugar, wheat, cochineal, indigo and coffee were among the agricultural products, while hides, salted meats and wool were among the products derived from livestock,” outlines Professor Garavaglia. “Minerals such as silver, gold, and copper were particularly important for the cases of Chile, Peru, Bolivia, Colombia and Mexico.”

Most of the financial resources came from foreign trade – especially from the taxes on imported merchandise for mass consumption – and from a series of internal taxes that were inherited from the colonial period, such as monopolies on salt, tobacco and liquor

Ecuador, Archivo Histórico del Ministerio de Cultura. Aurelio Salvador y otros, Guayaquil 1875

Nation building This level of trade helped the new nations of Latin America establish themselves economically, but it was not until later in the 19th century that they were more fully integrated into the world economy, with the skills of their people and natural resources key to the development of national wealth. Having extensively studied finance, bureaucracy and the weight of war, Professor Garavaglia and his colleagues now plan to move on to study the ‘fourth foot’ of nation building – the judicial system – in each of the selected cases. “We have already published four books about these subjects and now await the publication of two volumes in English about bureaucracy and civil servants, comparing America, Europe, and Asia. We also have seven other studies, which we will turn into books on each of the national cases,” he says.



Making waves in the energy market: Has the tide turned? With around 70 per cent of the Earth’s surface covered by saltwater oceans, it is hardly surprising that there is a great deal of interest within Europe in the technical challenges posed by extracting renewable energy from the waves and tides that often batter many European coastlines



Twenty-four countries have deployed ocean energy conversion systems, comprising 19 countries with installed wave systems and 13 countries with installed tidal stream systems.

However, according to MARINET (the Marine Renewables Infrastructure Network), most marine renewable energy systems are still largely at the pre-commercial stage of development.

MARINET is an €11 million, four-year network initiative which is majority-funded through the EC’s Seventh Framework Programme (FP7). The network has 29 partners with 42 specialist marine research facilities spread across eleven EU countries and one FP7 partner country, Brazil. It covers all aspects of offshore renewable energy development - wave energy, tidal energy, offshore wind energy, common issues such as electrical/grid research, and environmental and resource data. The potential global market value for wave energy and tidal energy is truly enormous. According to an Altprofits report citing a World Energy Council forecast, the potential market for wave energy alone is worth about $1 trillion worldwide, and the global tidal range energy potential is estimated to be about 2.5-3 terawatt hours (TWh), with 1 TWh available in comparably shallow waters. The potential worldwide market for marine power could be 2,000-4,000 TWh a year, roughly equal to current nuclear and hydro-electric power contributions. In the UK, marine power has the potential to provide one fifth of the total power demand of 350 TWh. The World Energy Council also predicts that wave technology alone could supply 6.5 per cent of the energy needs of the United States of America.

In both wave and tidal energy sectors there are many concepts, but to develop these into successful commercial operations requires extensive testing at all scales - from concept and theory through to small-scale model, intermediate and full ocean scales.

The European Marine Energy Centre (EMEC) Ltd, based in Orkney, off the northern-most tip of the Scottish mainland, will in 2013 celebrate its tenth anniversary as the first - and only - centre of its kind in the world to provide developers of both wave and tidal energy converters with purpose-built, accredited testing facilities. EMEC was established following a 2001 recommendation by the House of Commons Science and Technology Committee.

The technologies being tested at EMEC in some of Europe’s harshest marine environments will help developers whose aim is to generate electricity by harnessing the power of waves and tidal streams. EMEC offers 14 full-scale test berths - eight for tidal concepts at the Fall of Warness off the island of Eday, and six for wave concepts at Billia Croo in Stromness - taking advantage of ideal locations in Orkney, which has an excellent oceanic wave regime and strong tidal currents. Orkney also offers excellent harbour facilities and, according to Max Carcas, EMEC external liaison manager, now benefits from a ‘cluster’ effect, caused by the presence of EMEC.

“This is a really positive benefit to us and to Orkney. For example, there used to be one workboat that could handle operations at sea - now there are at least three, with operators gaining experience in different conditions and using different pieces of kit. We have also seen the growth of environmental consultancies, whose activities can only enhance the work going on here,” he said. Indeed, a report by Scottish Renewables highlighted over 11,000 jobs across the renewable energy industry in Scotland. Of the 521 marine energy jobs identified, two-thirds of them are in the Highlands and Islands and nearly half are in Orkney alone. Scotland is now home to some of the most advanced wave and tidal technology developers in the world: Pelamis Wave Power, Aquamarine Power, Voith Hydro Wavegen, AWS Ocean Energy and Scotrenewables. Many of these companies are EMEC clients.

However, none of EMEC’s test berths would be attractive to developers if it was not for the existence of grid connections. In 2004, Pelamis Wave Power’s P1 machine became the first floating wave energy converter anywhere in the world to generate to the grid, and in 2008 an OpenHydro turbine became the first tidal energy converter to generate to the UK grid.

In 2010, EMEC expanded the test site, laying new cables and installing new cable-end terminations, allowing developers faster, safer and easier connections.

We’re putting machines in areas you would normally try to avoid with strong tides and big waves. It is key that machines are designed to cope with both the extremes and fatigue loading that is encountered in this environment



EMEC can now boast it has deployed more grid-connected marine energy converters than any other single site in the world. “Connectivity is a big issue for marine projects,” says Max, “as it can vary from country to country. For example, a lot of Portugal’s grid capacity is along its coast, whereas the UK grid tends to peter out going north, although there are existing links to the Western Isles - and Orkney, which are now fully utilised. Establishing new links requires a large investment.” EMEC has two full-scale sea-bed test sites, each approximately 2km by 4km, with at least 50m between each berth. The full-scale test sites are sited in areas with harsh conditions. It also has scale test sites for smaller-scale models, or for technique or component testing, in more benign conditions. At the full-scale test sites, the water depth is up to 70m for wave power installations, and at the tidal test site the berths range from 12m to 50m in depth.

The impact of tidal and wave power installations on the abundant wildlife in Orkney waters is monitored through the EMEC Monitoring Advisory Group, which carries out environmental studies to ensure the consistent collection of data. This avoids the risk of inconsistent data that could arise if the studies were carried out by individual developers. Sea mammals and fish are observed to see, for example, if the pitch and frequency of generators, and tip-speed noise from tidal turbines, are having an impact on wildlife. It is worth noting, however, that tip-speed noise has to contend with the natural noise generated by heavy seas and waves. EMEC is also at the forefront in the development of international standards, having co-ordinated the development of a suite of 12 industry guidelines, six of which are being progressed for global adoption as the first international standards for marine energy. There is also business activity in the embryonic marine renewable industry. In September 2012, Rolls-Royce plc announced it was selling Tidal Generation Ltd (TGL) to Alstom of France.

Two years earlier, TGL’s 500kW tidal stream turbine concept demonstrator unit was deployed at EMEC and generated more than 200 MWh of electricity supply to the grid, becoming the first EMEC-located project to receive Renewables Obligation Certificates, which is how utility companies have to meet their commitments to renewable energy.

Now, as part of the ReDAPT (Reliable Data Acquisition Platform for Tidal) consortium project, which is commissioned and co-funded by the Energy Technologies Institute, TGL’s 1MW tidal stream turbine is currently being installed off Orkney to be tested in various operational conditions over the next two years.

Sea conditions are important, but they vary. Tidal conditions are very predictable, linked as they are to the lunar cycle. Wave conditions result from the time-averaged transfer of wind energy to the sea. Wave energy levels are thus quite forecastable, having been produced as the result of historic weather conditions in the mid-Atlantic, which then take 2-3 days to reach the coast. Both resources are, therefore, a good fit for electrical network operators, offering to increase the penetration of renewables into the mix, while minimising balancing costs.

“We’re putting machines in areas you would normally try to avoid,” says Max, “with strong tides and big waves. It is key that machines are designed to cope with both the extremes and fatigue loading that is encountered in this environment.”

Some tidal devices are bolted to the sea bed, others weighed down with ballast, while other wave devices are tethered.

The sea bed is managed by the Crown Estates. EMEC, which has received blanket approval from the Crown Estates for its berths, has pioneered the process of gaining the necessary consents and permits for its clients and takes these clients through the process.

Through its links with industry, government and academia, EMEC is at the heart of marine renewable research. It has links with universities through the SuperGen UK Centre for Marine Energy Research, whose core consortium consists of the University of Edinburgh, the University of Strathclyde, Queen’s University Belfast and the University of Exeter. The consortium includes the associate universities of Plymouth, Heriot-Watt, Lancaster, Manchester, Swansea, Oxford and Southampton.

Fitting EMEC purpose-built cable ends



Co-located at Stromness in Orkney, where EMEC also has its offices and data centre, Heriot-Watt University has established the International Centre for Island Technology (ICIT), a specialist arm of the Institute of Petroleum Engineering. ICIT was established in 1989 to carry out advanced research, postgraduate training and consultancy in marine resource management and related issues. Research at all the universities in the SuperGen programme is expected to make a significant contribution to the marine renewable industry. “Every technology has gone through cost evolution,” explains Max. “For example, coal-fired power generation in the 1920s cost in real terms ten times what it costs today. And wind turbine costs are now 80 per cent less than they were in the mid-Eighties because costs fall as technology is deployed. “Typically you can see a 15-20 per cent reduction in cost for each doubling of installed capacity as costs are driven down through economies of scale. Wave and tidal energy’s ‘opening’ costs are pretty similar to where onshore wind started out but they have the potential to be very competitive, particularly when compared to offshore wind’s current costs. The current challenge and focus of the sector is to move from single to multiple installations.” EMEC’s test berths are currently single or double installations. E.ON has secured a site north of EMEC for a 50 megawatt (MW) wave farm, using up to 66 Pelamis Wave Power Ltd machines in an array connected to the UK grid. In 2009, E.ON became the first utility company to buy a wave power machine, with a Pelamis 750 kilowatt (kW) P2 machine, which comprises five connected sections that flex and bend in the waves. This movement is harnessed by hydraulic rams at the joints which, in turn, drive electrical generators located inside the device. The P2, which is 180m long, four metres in diameter and weighs about 1,350 tonnes, was installed at the Billia Croo wave test site in October 2010, where it operates alongside a similar P2 machine owned by ScottishPower Renewables.
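
Max Carcas’s figures describe the standard experience-curve relationship. As an illustrative aside (not part of the EMEC or SuperGen analysis), the arithmetic of a 15-20 per cent cost reduction per doubling of installed capacity can be sketched as follows; the starting cost and capacity values are purely hypothetical.

```python
import math

def unit_cost(initial_cost, initial_capacity, capacity, learning_rate):
    """Unit cost after cumulative installed capacity grows from initial_capacity
    to capacity, falling by `learning_rate` for every doubling of capacity."""
    doublings = math.log2(capacity / initial_capacity)
    return initial_cost * (1.0 - learning_rate) ** doublings

# Hypothetical figures: a starting cost of 300 (arbitrary units) at 10 MW installed,
# a 15 per cent learning rate, evaluated after six doublings (640 MW installed).
print(round(unit_cost(300.0, 10.0, 640.0, 0.15), 1))  # -> 113.1
```

At a 20 per cent learning rate the same six doublings would leave unit costs at roughly a quarter of their starting value, broadly the kind of trajectory the article describes for wind power since the mid-Eighties.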


The two utilities have a working agreement to maximise lessons from operating and maintaining their machines as a wave farm. A three-year test programme has been structured to test the P2 over a defined time in various weather states, each with progressively higher wave heights. This knowledge will be used by E.ON in the development of its 50MW wave farm. ScottishPower Renewables is also advanced with its Islay Tidal Array project, using ten Andritz Hydro Hammerfest devices – the UK’s first consented tidal array project, to be completed in 2015. This followed the successful installation at EMEC, in December 2011, of its 1 megawatt (MW) pre-commercial tidal turbine, destined to validate the technology for future tidal power arrays. It delivered its first energy to the grid in February 2012. EMEC, says Max Carcas, would like to support its clients on array technological testing. “Some people say what we are doing is a bit like rocket science - in some ways it is and in some ways it isn’t. Capturing energy from the sea is not as difficult as putting a man on the moon, however it does involve dealing with a lot of ‘unknown unknowns.’ In the same way that the space race produced a number of innovative technologies, research on wave and tidal energy also has the potential to produce a lot of useful spin-offs. The challenge is firstly an engineering challenge - to make something that works and is reliable. “Linked to this is the commercial challenge - to get private sector investment into the technologies and projects to enable this; then linked to this is the political challenge of getting the right mechanisms to stimulate this investment. While it may not be rocket science, perhaps we should follow the example of the space race and make a real Europe-wide commitment to harnessing this large-scale indigenous energy resource and the research and engineering resource required to achieve this within the next 5-10 years.”

www.emec.org.uk




Thermodynamics of the climate system The NAMASTE project is developing new theoretical approaches and new diagnostic tools to analyse the structural properties of the climate system, work which could help improve our understanding of the global climate, its properties in past and present conditions, and how it may evolve in the future. We spoke to scientific coordinator Professor Valerio Lucarini about the project’s research and his plans for the future. The climate system

consists of four intimately interconnected sub-systems, atmosphere, hydrosphere, cryosphere, and biosphere, which evolve under the action of macroscopic driving and modulating agents, such as solar heating, Earth’s rotation and gravitation. The climate system features many degrees of freedom - which makes it complicated – and nonlinear interactions taking place on a vast range of time-


space scales, accompanied by sensitive dependence on the initial conditions – which makes it complex. As a result of the complex nature of the climate system, even the most sophisticated climate or weather forecast models cover only a relatively small portion of the range of scales where variability is observed, and often one needs to select different approximations and even different

formulations of the relevant dynamics depending on the problems under investigation. Moreover, it is common practice – and a practical necessity – to develop parametrizations that take into account the effects of unresolved small-scale processes on the scales explicitly resolved by the model. Such parametrizations increasingly contain a mixture of stochastic and deterministic formulations.



General properties The project combines the investigation of rather theoretical aspects of climate dynamics with the development of practical tools that can be used to improve and test climate models, which are fundamental for understanding the general properties of the climate system. However, assessing their quality is a challenging task; Professor Lucarini believes that in order to test robustly the models’ performance it is important to select carefully the so-called physical observables, i.e. quantities that define relevant properties of the climate system and account for its dynamical behaviour and response to perturbations. “In order to choose good observables we need to go back to fundamental scientific concepts and apply them to the investigation of the climate system. This is why I think that feeding in thermodynamic ideas will give us exactly the kind of basic tools and ideas we need to define these good climate observables,” he explains. The project’s focus is on understanding

the general properties of the climate system, but Professor Lucarini is also interested in some specific features that can be recognised phenomenologically. “In particular I’m referring to the large-scale features, in time and space, like the monsoon; I’m getting interested in the South Asian climate,” he says. This research requires sophisticated diagnostic tools that correspond to very precise basic ideas on how the climate works. From a macroscopic, general perspective, the climate system can be viewed as a non-equilibrium system, which produces entropy through irreversible processes and converts available potential energy into kinetic energy like a thermal engine. “The climate system dissipates kinetic energy and destroys energy availability through irreversible latent and sensible turbulent heat fluxes and mixing. The result of this is the production of entropy, occurring through a vast variety of processes (Fig. 1),” says Professor Lucarini. The NAMASTE project has developed diagnostic tools to analyse various aspects of the performance of this ‘climate engine’. “In this sense, it is possible to define the efficiency of the climate engine, just like one does in elementary thermodynamics when introducing the Carnot engine. The diagnostic tools we are using are able to determine the climate’s total entropy production and to quantify its efficiency, in a sense the most fundamental properties of a non-equilibrium irreversible system,” says Professor Lucarini. “By computing the degree of irreversibility of the system and the performance of its engine, we can define fundamental benchmarks to characterise how the climate system is described by various models and the climate’s response to changing conditions.”

Figure 1: Minimal conceptual model accounting for the material entropy production of the Earth system. Boxes 1 and 2 represent warm (low latitudes) and cold (high latitudes) atmospheric domains, coupled by energy transport. Boxes 3 and 4 represent warm (low latitudes) and cold (high latitudes) surface domains, coupled energetically to the corresponding atmospheric boxes above. Adapted from Lucarini et al. (2011). Lucarini, V., K. Fraedrich, and F. Ragone, 2011: New results on the thermodynamical properties of the climate system. J. Atmos. Sci. 68, 2438-2458.

In order to choose good observables we need to go back to fundamental scientific concepts and apply them to the investigation of the climate system. This is why I think that feeding in thermodynamic ideas will give us exactly the kind of basic tools and ideas we need to define these good climate observables

Researchers in NAMASTE are exploring a wide range of variations in parameters like atmospheric CO2 concentration or solar irradiance to investigate climate change and variability thoroughly. It is important in this work to take account of fundamental problems in climate dynamics research. “On one side, atmospheric and oceanic motions are driven by spatial differences in the incoming radiation, and on the other the very same motions statistically equilibrate the system and partially compensate the imposed temperature differences. So the climate is less warm in the tropics than it would be and the poles

Figure 2: It is possible to construct mathematical tools that help us to compute climate change rigorously. Reported in this figure are the Green functions for the total energy and momentum of the simple atmospheric model proposed by Ed Lorenz in 1996. These functions can be used to project into the future the changes in the considered physical quantities due to the perturbation (forcing) applied to the system. Adapted from Lucarini and Sarno (2011). Lucarini, V., and S. Sarno, 2011: A Statistical Mechanical Approach for the Computation of the Climatic Response to General Forcings. Nonlin. Proc. Geophys. 18, 7-28.



Figure 3: Non-equilibrium statistical mechanics provides tools for constructing parametrizations for fast processes occurring at small scales, which cannot be explicitly represented by global climate or weather forecast models. We report here a pictorial representation of the impact of small-scale processes on the large-scale processes, reminiscent of Feynman diagrams. Adapted from Wouters and Lucarini (2012). Wouters, J. and V. Lucarini, 2012: Disentangling multi-level systems: averaging, correlations and memory, J. Stat. Mech. P03003.
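
To give a flavour of what Figure 3 summarises: in this kind of weak-coupling treatment, the unresolved fast variables act on a resolved variable x through three contributions, namely a deterministic mean-field correction, a stochastic term and a memory (time-delayed) term. The schematic form below is a generic sketch of that three-term structure, with M, sigma, K and h standing for model-dependent functions; it is not reproduced from the cited paper.

\[
\dot{x} \;=\; F(x) \;+\; M(x) \;+\; \sigma(x)\,\xi(t) \;+\; \int_{0}^{\infty} K(\tau)\, h\!\left(x(t-\tau)\right)\mathrm{d}\tau
\]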

Figure 4: The efficiency (η) of the climate system decreases at the tipping points, as the system collapses to a state closer to equilibrium. Here we present the transition of the Earth’s climate between the Warm (W) and the snowball (SB) state triggered by changes in the solar insolation (S*). Adapted from Lucarini et al. (2010). Lucarini, V., K. Fraedrich, and F. Lunkeit, 2010: Thermodynamic Analysis of Snowball Earth Hysteresis Experiment: Efficiency, Entropy Production, and Irreversibility. Q. J. R. Meteorol. Soc. 136, 2-11.
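
For reference, the efficiency plotted in Figure 4 is defined by analogy with the textbook Carnot engine, whose efficiency depends only on the temperatures of the warm and cold reservoirs; the climate-engine version uses suitably averaged temperatures of the regions where the system absorbs and releases heat. The formula below restates the standard Carnot case, not the project’s exact diagnostic.

\[
\eta_{\mathrm{Carnot}} \;=\; \frac{T_{+}-T_{-}}{T_{+}} \;=\; 1-\frac{T_{-}}{T_{+}}
\]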


are less cold than they would be in the absence of the large-scale motions of the planetary fluid envelope,” explains Professor Lucarini. A related issue the project is looking at is heat transport from low-latitude to high-latitude regions and how this transport changes in different climate conditions; while this is complex work, Professor Lucarini says his research team has found fundamental laws connecting the meridional heat transport, climate efficiency, and the intensity of the Lorenz energy cycle. “There is an overarching form, an overarching idea, putting this picture together. However, there is still a huge amount of research to be done,” he cautions. This includes research into the other atmospheric processes that produce entropy, such as water transport and the irreversible transport of heat from warm to cold regions. However, Professor Lucarini says the hydrological cycle is the most important factor contributing to entropy production in the climate system in its current condition. “One could estimate that the time- and space-averaged material entropy production in the present climate is of the order of 50 milliwatts per Kelvin per square metre. About 70 per cent of this figure is due to the irreversible water phase changes relevant to the hydrological cycle,” he outlines. This research can help scientists define the degree of irreversibility in the climate system. “We have studied how changing levels of solar radiation and opacity in the atmosphere affect the climate,” continues Professor Lucarini. “When changing the parameters, we consistently see that when the climate gets warmer material entropy production becomes very large, mostly because of the intensification of the hydrological cycle, while the generation of kinetic energy – and consequently its dissipation – is pretty flat.” Therefore, a warmer climate is, in some sense, more irreversible. Non-equilibrium statistical mechanics is also playing a major role in the project. “The team is developing new tools for computing rigorously the response of the climate to perturbations, setting in a

unique framework concepts like climate sensitivity and climate response,” says Professor Lucarini. “Using a suitable formalism, it is possible to reconstruct for a given spatial pattern of forcing the so-called Green function of the climate observable we are considering. Such a Green function allows us to compute explicitly the time-dependent response of the system to a forcing featuring a general time modulation. In other terms, we have a rigorous framework to predict – in a statistical sense – climate change. After preliminary positive tests on simple models, we are currently developing algorithms to test these ideas on fully coupled climate models,” says Professor Lucarini (see Fig. 2). As mentioned above, dealing effectively with the multiscale properties of the climate system is a fundamental step towards its understanding and has a lot of practical relevance for climate model development. “We have recently discovered that by using new, suitably defined tools of statistical mechanics it is possible to derive the parametrization describing the impact of fast processes occurring on unresolved spatial scales on larger scales of interest. We have found that current state-of-the-art parametrizations lack the ability to describe memory effects. The parametrizations can be constructed using Feynman-like diagrams.” (Fig. 3). Furthermore, the formalism of response theory allows one to compute rigorously the effect of very general stochastic perturbations on the system, and to use such perturbations as probes for detecting the system’s resonances.
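
The Green-function statement in this quote has a compact general form. In linear response theory, the first-order change in the statistical expectation of an observable A under a forcing with a given spatial pattern and time modulation f(t) is obtained by convolving f with the observable’s Green (response) function; the expression below is the generic textbook formula, given as a sketch of the framework rather than a quotation of the project’s own derivations.

\[
\delta\langle A\rangle(t) \;=\; \int_{0}^{\infty} G_{A}(\tau)\, f(t-\tau)\,\mathrm{d}\tau
\]

Predicting climate change “in a statistical sense” then amounts to estimating the Green function from model experiments and convolving it with any forcing scenario of interest.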

Stability of the climate system Knowledge of these processes will help scientists refine and improve climate models, but it is also important in terms of understanding the structure of the climate system and assessing its response to perturbations. The idea of a tipping point – when the climate system slips from a marginally stable to a more stable condition – is a particularly prominent topic in climate research. “Given the external parameters of the climate system, such as solar radiation, orbital parameters and atmospheric composition, is our



At a glance Full Project Title Thermodynamics of the Climate System (NAMASTE) Project Objectives • Thermodynamic analysis of mechanisms relevant for climate variability and change; • Development of a new generation of diagnostic tools for analysing climate models and satellite data; • Radically improved understanding of tipping points, climate sensitivity, and climate response to forcings through non-equilibrium statistical mechanics; • Memory effects, stochasticity, and extremes in geophysical fluid dynamical systems; • International activities of scientific training, dissemination, and networking. Project Funding European Research Council - Starting Investigator Grant (http://erc.europa.eu/) Project Details The research is conducted at the Institute of Meteorology, University of Hamburg, Germany and at the Department of Mathematics and Statistics, University of Reading, UK

climate in the only state it could be, or could it also be in other states?” asks Professor Lucarini. A lot of evidence suggests that our planet could support a different climate state under the present external parameters. “We have looked at the so-called transition from our current climate to a snowball climate and the other way round,” outlines Professor Lucarini. “Each time the system crosses the tipping point and moves into the other state, the efficiency decreases. After the transition, the system is less efficient (Fig. 4), which means it is closer to equilibrium.” While no firm conclusions have yet been drawn, the work could be extended further. “The idea is that we could have a global indicator, which describes the overall physical properties of the system, that could tell us about the change in the basic mathematical properties of the system at the transition,” says Professor Lucarini. This research could help inform environmental policy, and alongside

establishing links with various international bodies and research initiatives, starting with the IPCC, as well as favouring the interaction and collaboration of different scientific communities, Professor Lucarini is keen to help train researchers. “I am organising a two-month programme – to be held in 2013 at the Newton Institute of Mathematical Sciences in Cambridge – in which scientists from across the world will participate,” he says. “The goal of this programme is to create strong links between the core problems of geophysical fluid dynamics and climate science and the core problems of statistical mechanics, extreme value theory and partial differential equations. This should be a forum to inform colleagues involved in modelling and evaluation of climate models and to contribute to the improvement of models, as well as to inform colleagues in mathematical and physical research of the fascinating problems that the geosciences are proposing to the wider scientific community.”

Contact Details Project Coordinator, Professor Valerio Lucarini T: +49 (0) 4042 838 9208 E: valerio.lucarini@zmaw.de W: http://www.mi.uni-hamburg.de/index.php?id=6870&L=3 The research leading to these results has received funding from the European Research Council under the European Community’s Seventh Framework Programme (FP7/2007-2013) / ERC Grant agreement No. 257106

Professor Valerio Lucarini Project Coordinator

Professor Valerio Lucarini (b. Ancona, Italy, 1976) studied physics and geosciences at undergraduate and postgraduate level in Italy, the USA, and Finland. He is currently Professor of Theoretical Meteorology (University of Hamburg, Germany) and Reader in Complex Systems Science (University of Reading, UK).




There is currently no common standard for optical measurements at eddy covariance flux tower sites. We spoke to project coordinators Drs Alasdair Mac Arthur, Enrico Tomelleri and Albert Porcar-Castell about EUROSPEC’s ongoing work in developing optical instrument specifications, techniques and protocols to standardise these measurements

Developing common measurement protocols Carbon dioxide (CO2)

and water vapour (H2O) are key variables used in climate models to predict future atmospheric composition and climate change. Therefore, to improve model forecasts it is necessary to understand how the flux of CO2 and H2O between the Earth’s surface and the atmosphere is regulated, a process largely controlled by photosynthesis. Atmospheric CO2 is taken up by plants during photosynthesis and a fraction of it is stored as biomass, the rest being lost back into the atmosphere through respiration. Photosynthesis also results in oxygen and water being pumped into the atmosphere. Many


scientists use eddy covariance techniques to measure these fluxes with instruments mounted on towers at field sites around Europe and beyond. “In short, these measurements sample the instantaneous concentrations of CO2 and H2O and wind (eddies) direction over an ecosystem to estimate its fluxes,” explains Dr. Albert Porcar-Castell, one of the EUROSPEC WG co-leads. The resulting data can be used to understand, model, and forecast the way our ecosystems interact with their environment and climate. But scientists also want to know what these fluxes and interactions are at the landscape and

regional levels in order to answer global questions. A promising new approach to this problem has been developed: the use of optical data (such as reflectance indices based on greenness) gathered both at flux towers and from satellite sensors to link ecosystem-level eddy covariance measurements with global coverage satellite data. “This provides a potential tool to understand the way different ecosystems interact with the environment, or to monitor the status of our ecosystems from space and with global coverage in near-real-time,” says Dr. Porcar-Castell.
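
The eddy covariance idea Dr Porcar-Castell summarises can be made concrete with a toy calculation: the vertical turbulent flux of a scalar such as CO2 is estimated as the time-averaged covariance between fluctuations of the vertical wind speed and fluctuations of the scalar concentration. The snippet below is a minimal illustrative sketch with synthetic numbers, not the processing chain used at real flux towers (which adds coordinate rotation, density and spectral corrections, gap filling and quality control).

```python
import numpy as np

def eddy_flux(w, c):
    """Turbulent flux as the covariance of vertical wind w (m/s) and a scalar
    concentration c (here a CO2-like quantity), i.e. F = mean(w'c')."""
    w = np.asarray(w, dtype=float)
    c = np.asarray(c, dtype=float)
    w_prime = w - w.mean()   # fluctuations about the averaging-period mean
    c_prime = c - c.mean()
    return np.mean(w_prime * c_prime)

# Synthetic half-hour of 10 Hz data: downward CO2 transport shows up
# as a negative covariance (net uptake by the ecosystem).
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.3, 18000)                     # vertical wind fluctuations
c = 400.0 - 5.0 * w + rng.normal(0.0, 1.0, 18000)   # CO2 anti-correlated with w
print(eddy_flux(w, c))                              # negative value -> uptake
```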



With an expanding list of potential applications, there has been a dramatic increase in the use of optical measurements at flux towers. However, the techniques have evolved over time, with scientists typically using instruments they’re familiar with rather than following a common standard, an issue EUROSPEC is working to address. “The main aim of EUROSPEC is to develop common protocols and instrument specifications for use within a European network of eddy covariance flux tower sites. The instruments we’re interested in make optical measurements,” says Dr Alasdair Mac Arthur, who co-ordinates EUROSPEC’s instrumentation work together with Dr Porcar-Castell. The project’s primary focus is measuring sunlight as it is reflected from vegetation; currently two main approaches are used. “One is to use multi-band instruments that measure a few narrow bands of light, an approach that has been used for a number of years; alternatively, hyperspectral instruments measure many very narrow bands. As technology has advanced, hyperspectral sensors have become more widely available – in fact there are plans to launch a hyperspectral sensor (EnMAP) on a satellite platform in 2015. This will greatly enhance our ability to measure and monitor environmental change,” suggests Dr Mac Arthur.

Wavelength of light These optical measurement techniques allow researchers to gather data from regions of the spectrum through and beyond human visual perception. The average person can see light in a wavelength range of between approximately 400 to 700 nanometres, whereas the hyperspectral instruments being used in EUROSPEC can measure from 400 to 1,000 nanometres, and possibly even up to 2,500 nanometres. This will allow researchers to gather large amounts of data from which the health


and vigour of vegetation and its capacity to sequester atmospheric CO2 can be inferred. “Hyperspectral instruments measure light at around 1.5 nanometre intervals and in hundreds of overlapping bands providing a continuous spectrum, whereas multi-band instruments measure in 10 nanometre or even 50 nanometre wide bands. There are normally fewer than ten bands across the visible and near infra-red

spectral region, and there are often large gaps between the bands recorded,” says Dr Mac Arthur. “Overall, hyperspectral instruments provide scientists with much more information that can be used to draw more detailed inferences on plant physiological processes, leaf chemical content composition and/or canopy structure. From this information we can then estimate the dynamics of plant photosynthetic performance and model ecosystem carbon and water fluxes.”

Measuring light at eddy covariance flux towers of course depends on natural illumination. Scientists measure sunlight after it has interacted with, and been reflected by, vegetation or other elements of the Earth’s surface. “The instruments we use measure the reflected light and from that we can infer properties of the vegetation such as the photosynthetic activity, light use efficiency, pigments, water content and possibly biomass,” continues Dr Mac Arthur.

The main aim of the project is to develop common protocols and instrument specifications for use within a European network of eddy covariance flux towers. The instruments we’re interested in make optical measurements

Optical sensors The project is working to improve both the instrumentation and the optical techniques used to make these measurements. Dr Mac Arthur says there are several strands to this work. “In EUROSPEC we are investigating the performance of optical sensors, making recommendations for their improvement and are also trying to improve the way instruments are used in the field. While we have developed some sensors ourselves, in the main we utilise proprietary instrumentation, but use it in novel ways,” he says.

But the reflected signal also varies as the solar irradiance varies. Therefore, as irradiance changes over time, through cloud cover or changes in the position of the Sun for example, researchers need to use instruments that measure and record both the irradiance and radiance (or reflectance) simultaneously. Scientists within EUROSPEC have developed hyperspectral instruments which Dr Mac Arthur believes will lead to significant improvements. “This will help us understand reflectance off the vegetation and account for changes caused by changing levels of illumination,” he outlines. “In the ICOS initiative (Integrated Carbon Observation System) scientists plan to gather data from between 30 and 50 eddy covariance flux sites across Europe. From that data they aim to infer the carbon dioxide and water vapour fluxes and consequently the response of the ecosystems of those sites to climate change.” Scientists will use data acquired by optical sensors on airborne or satellite platforms to scale from local measurements at the flux towers to wider spatial areas,



such as specific ecosystems or geographic regions. “We will try to link the optical data to the measured fluxes and vegetation biophysical parameters at the eddy covariance sites,” explains Dr Mac Arthur. “We can then try to understand synoptic data acquired by optical imaging sensors mounted on airborne and satellite platforms to infer fluxes, and the consequences of climate change, across larger areas.” Specialising in the instruments used to measure light, Dr Mac Arthur uses the data from those measurements to draw inferences on vegetation biophysical parameters using radiative transfer (RT) modelling. Hyperspectral measurements are an important part of this process. “This data and these RT models aid our understanding of the response of vegetation to climate change and its impact on ecosystems such as peatlands, which have been sequestering carbon from the atmosphere for millennia.”
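
A concrete way to see why irradiance and upwelling radiance must be recorded together: surface reflectance at each wavelength is, to a first approximation, the ratio of reflected radiance to incoming irradiance, and vegetation indices are then built from reflectance in chosen bands. The sketch below computes a simple greenness index (NDVI) from two bands of a synthetic hyperspectral spectrum; the wavelengths and values are illustrative only, and the code ignores the calibration and angular corrections a real field spectrometer workflow would apply.

```python
import numpy as np

# Wavelength grid: roughly 1.5 nm sampling between 400 and 1000 nm
wl = np.arange(400.0, 1000.0, 1.5)

# Synthetic, vegetation-like spectra (illustrative numbers only):
# healthy vegetation reflects little red light and much near infra-red.
irradiance = np.full(wl.size, 1.0)                         # flat incoming irradiance
radiance = np.where(wl < 700.0, 0.05, 0.45) * irradiance   # step at the red edge

def reflectance(radiance, irradiance):
    """Reflectance factor: upwelling radiance / simultaneous downwelling irradiance."""
    return radiance / irradiance

def band_mean(rho, wl, centre, width=10.0):
    """Average reflectance over a narrow band centred on `centre` (nm)."""
    mask = np.abs(wl - centre) <= width / 2.0
    return float(rho[mask].mean())

rho = reflectance(radiance, irradiance)
red = band_mean(rho, wl, 670.0)    # red band
nir = band_mean(rho, wl, 800.0)    # near infra-red band
ndvi = (nir - red) / (nir + red)   # a simple "greenness" index
print(round(ndvi, 2))              # 0.8 for this synthetic canopy
```

If the irradiance changed (a passing cloud, a lower Sun) but were not re-measured, the apparent radiance would change and the inferred reflectance and index would drift for reasons that have nothing to do with the vegetation.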

Refining models By providing more accurate carbon flux estimates and continuously monitoring changes in our climate, scientists can refine their models, reduce uncertainties and improve the accuracy of their predictions. “We are trying to improve the measurement techniques and improve the quality of the data, which will be of use to climate scientists,” says Dr Mac Arthur. A standardised approach to this type of work will allow scientists to collaborate more effectively. “Work is ongoing to develop systems that can be used in the field to cross-compare our instruments and data,” outlines Dr Mac Arthur. “The COST Action EUROSPEC project has funded short-term scientific missions, and PhD students have come to the FSF to try and understand the instruments we’re

using and how they’re calibrated.” This is as much a political challenge as a technical one, as nations are inclined to maintain their own standards. However, EUROSPEC scientists regularly collaborate with other researchers across the world to improve methods and understanding. “We need to make sure we’re all trying to measure the same things, taking a compatible approach and working to the same standards so that our results are directly comparable,” stresses Dr Mac Arthur. Researchers are keen to further improve instrumentation, measurement and analysis techniques, particularly the spatial and spectral scaling. “We need to be able to make hyperspectral measurements near to the ground, so from comparatively small sample areas, and scale these up to measurements made from satellite platforms by sensors with different spectral sampling intervals and samples from much larger surface areas,” says Dr Mac Arthur. “As many of the surfaces we are interested in are heterogeneous – particularly peatlands – understanding data sampled from multiple observation scales is challenging!”

Spatial representativeness Another EUROSPEC working group, co-ordinated by Dr Enrico Tomelleri of the Max-Planck-Institut für Biogeochemie, will investigate these scaling issues, a very active area of research. Drs Mac Arthur and Tomelleri and others believe the restricted spatial representativeness of current optical measurements at eddy covariance flux sites demands a new approach. “The major issue is how we relate optical measurements from samples at different spatial scales and how we aggregate them up to draw inferences across ecosystems and biomes. Research will go on,” he says.

At a glance Full Project Title Spectral Sampling Tools for Vegetation Biophysical Parameters and Flux Measurements in Europe (EUROSPEC) Project Objectives The main objective of the Action is to develop common protocols and new instruments within a larger European network for optical measurements, bringing together scientists and industries in order to increase the reliability, value and cost-efficiency of the existing spectral observations within the European flux network. The Action will focus entirely on the optical sampling strategies, which can be considered a fundamental tool in monitoring Biophysical Parameters (BP) and which act as a “bridge” between the flux tower and the remote sensing community. Contact Details Project Coordinators, Albert Porcar-Castell T: +358 (0)40 587 0444 E: joan.porcar@helsinki.fi Enrico Tomelleri T: +49 3641 576270 E: etomell@bgc-jena.mpg.de Alasdair Mac Arthur T: +44 (0) 131 650 5926 E: alasdair.macarthur@ed.ac.uk W: http://cost-es0903.fem-environment.eu/

Drs Albert Porcar-Castell, Enrico Tomelleri and Alasdair Mac Arthur

Project Coordinators

Albert Porcar-Castell is a Finnish Academy Postdoctoral Fellow at the Department of Forest Sciences of the University of Helsinki. He investigates the spatio-temporal link between optical data and the physiology of photosynthesis. Enrico Tomelleri is a Postdoctoral Researcher in the Model Data Integration group at Max Planck Institut für Biogeochemie. His research foci are light use efficiency models, data assimilation and up-scaling methods. Alasdair Mac Arthur has a Ph.D. in field spectroscopy and radiative transfer modelling and is the Operations Manager of the NERC Field Spectroscopy Facility, Geosciences, University of Edinburgh. His research interests are the performance of field and imaging spectrometers, Earth observation optical measurement methodologies and data scaling issues in the spectral and spatial domains.



Better the Dust Devil You Know Improving the Representation of Mineral Dust Emissions in Earth System Models Professor Peter Knippertz, Professor of Meteorology at the University of Leeds and a leading expert in the analysis, modeling and forecasting of storm systems, talks to EU Research about Desert Storms, an ERC funded project which hopes to revolutionise the way mineral dust emissions are treated in numerical models of the Earth system

Current models of the Earth system struggle to reliably represent the effects that mineral dust from natural soils has upon weather and climate; dust influences many important elements of the Earth System such as radiation, cloud microphysics, atmospheric chemistry and the carbon cycle. Current estimates of dust emission and deposition are very uncertain, partly due to the lack of good observations; it is this uncertainty that the Desert Storms project hopes to address. Over the past 20 years, researchers have been developing models of the Earth system to incorporate more complex components, as computer technology has become more advanced. "In the early days, a climate model was primarily a model of the atmosphere," Prof. Knippertz tells us. "But then this was coupled with ocean models, and soon more components were added, such as the cryosphere and vegetation. Now we are trying to include atmospheric composition as well, including chemical and aerosol processes." Natural soil dust is the most abundant aerosol species by mass, making it an important component of the Earth system. In recent years dust has been incorporated in a wide range of models. "The UK Met Office for example is now running dust in their operational weather forecasting models," Prof. Knippertz says. "There are two main reasons for this. The first is that they want to warn of dust storms because of health implications and problems for air and road traffic, especially considering there are military operations being undertaken in areas where dust storms are a great concern. The second reason is that incorporating effects of dust into atmospheric models is hoped to improve weather forecasting."


One of the biggest challenges facing the project is the significant discrepancy between computer models of dust production and observational studies of dust storms. From a climate perspective, estimates of global dust mass emissions can differ by a factor of four or five. “If you also consider weather forecast models which look for individual dust plumes in deserts for example, the uncertainty is even greater,” Prof. Knippertz tells us. “There are entire dust storms missed out by the models that we have. We don’t know enough about the characteristics of the desert soils, such as the binding energies between individual particles and their size distribution, but what we are arguing in this project is that there are also many problems on the meteorological side which have been neglected.” To achieve their overall objective, the Desert Storms team will utilize detailed observations from recent field campaigns into the Sahara desert and its fringes. “Field campaigns are always relatively short ventures due to logistics and cost, however you do get both high resolution and high quality data,” Prof. Knippertz tells us. As well as information from these campaigns, the team also uses data from a network of surface meteorological stations. “Most dust storm areas are uninhabited,” Prof. Knippertz says. “It is a challenge to use the sparse information available from these networks, for example from the Sahel, to learn more about dust emissions.” The Desert Storms team has also been using satellite information to bolster the time-limited data collected from field campaigns. “Of course,” Prof. Knippertz explains, “satellites have their own setbacks. Cloud cover can obscure dust

emissions and satellites typically don’t give you information about the vertical profile of dust concentrations.” To address this problem, spacebourne lidars can be used, however they offer a very limited space / time coverage. “It is a real puzzle to try and make the best use of the different data available,” Prof. Knippertz explains. “We do a lot of computer modeling to fill in the information gaps from our observations. However, even our models encounter problems in representing the complex meteorology over deserts. We want to learn from this where the models fail and use this information to suggest ways of how we can improve them.” Typically current Earth system and climate models operate on a horizontal grid with an average spacing of 100km or more. The drawback with this is that many dust storms tend to be much smaller, and so the models do not have the capacity to represent them in a realistic way. “We have to ask ourselves how can we take information from the grid scale and use it to learn in what way this determines what is happening on a sub-grid scale,” Prof. Knippertz explains. “If I have a 100km by 100km climate model gridbox, and I have information regarding the pressure, wind and temperature in this box, can I estimate statistically how many dust storms there may be and of which intensity?” This information can then be fed into Earth system models to create more accurate estimations of dust emissions from any given model grid point. One of the ways that the project will help reduce uncertainties is by investigating different types of dust storm; from individual dust devils up to continental storms there is a wide spectrum to be found. “Dust devils are the smallest and



False-colour satellite image showing a huge dust cloud over the Sahara on 09 June 2010. Dust stands out in pink to purple colours in this image. The storm is connected to the outflow of cool air from the deep convective storms over the Sahel appearing in dark red on the image. (©EUMETSAT)

Automatic weather station from the IMPETUS (http://www.impetus.uni-koeln.de) network in southern Morocco. The lack of high-quality observations is a key limitation to our understanding of dust emission processes. (Photograph courtesy of Peter Knippertz)

Dust Devil. (Photograph courtesy of Vern Knowles)

require very high resolution models to study them. By comparison, the West African monsoon can produce thunderstorms that stretch several hundred kilometers and create dust emissions along their gust fronts,” Prof. Knippertz says. “By working on different types of dust emissions individually we will add another level of complexity to the study.” The data analysis and modeling by the Desert Storms team will be used to create a

series of parameters that will enable Earth system models to account not only for the different types of dust storm, but also many other processes that affect dust emissions. “For example, low-level jets are another process we are concentrating on,” Prof. Knippertz explains. “During the day, the desert surface gets so hot that it causes very deep turbulent mixing of the atmosphere.” At night, dry air and cloud-free conditions can cause strong radiative cooling of a

shallow surface layer, which becomes uncoupled from the atmosphere above it. This allows winds at higher altitudes to increase, and when the turbulent mixing begins again in the morning, these high winds are brought down to the surface and create dust emissions. “Looking at observations, there are many days when the only dust emissions are caused by these low-level jets, and this can be challenging for models.”
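The gridbox question Prof. Knippertz poses – given only the resolved mean state, how much dust might sub-grid wind peaks emit? – can be illustrated with a toy calculation. The sketch below assumes that sub-grid 10 m wind speeds follow a Weibull distribution around the gridbox mean and applies a simple cubic emission law above a threshold; the distribution choice, the threshold, the shape parameter and the flux constant are all illustrative assumptions, not the parameterisations being developed in Desert Storms.

```python
import numpy as np
from math import gamma

def subgrid_dust_flux(mean_wind, u_threshold, shape=2.5, c_emis=1.0e-6):
    """Toy sub-grid estimate: integrate a cubic emission law over an assumed
    Weibull distribution of 10 m wind speeds within one model gridbox."""
    scale = mean_wind / gamma(1.0 + 1.0 / shape)            # Weibull scale from the mean
    u = np.linspace(0.01, 40.0, 4000)                       # wind speeds, m/s
    pdf = (shape / scale) * (u / scale) ** (shape - 1) * np.exp(-(u / scale) ** shape)
    emit = np.where(u > u_threshold,
                    c_emis * u ** 3 * (1.0 - (u_threshold / u) ** 2),
                    0.0)                                     # crude flux law above threshold
    prob_exceed = np.exp(-(u_threshold / scale) ** shape)    # P(wind > threshold)
    flux = np.trapz(emit * pdf, u)                           # gridbox-mean emission flux
    return prob_exceed, flux

# A gridbox whose mean wind (6 m/s) lies below the assumed 9 m/s threshold
# can still emit dust, because part of the sub-grid distribution exceeds it.
p, f = subgrid_dust_flux(mean_wind=6.0, u_threshold=9.0)
print(f"P(wind > threshold) = {p:.2f}, gridbox-mean flux (arbitrary units) = {f:.2e}")
```

The point of the exercise is simply that a gridbox whose mean wind never reaches the emission threshold can still produce dust once the sub-grid spread of winds is accounted for.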

Satellite image of dust storms on the West coast of Africa ©NASA



At a glance Full Project Title Towards an Improved Representation of Meteorological Processes in Models of Mineral Dust Emission (Desert Storms) Objectives • Compile an observational dataset of peak winds and dust emission. • Improve physical understanding of peak wind generation. • Assess relative importance of different processes for total dust emission. • Evaluate these processes in numerical models. • Determine model sensitivities with respect to resolution and model physics. • Identify optimal model configurations and develop new parameterisations Project Funding Desert Storms is funded by the European Research Council (ERC). Contact Details Principal Investigator, Professor Peter Knippertz School of Earth & Environment University of Leeds Leeds, LS2 9JT, UK T: +44 (0)113 343 6422 E: p.knippertz@leeds.ac.uk W: http://www.see.leeds.ac.uk/research/icas/working-groups/knippertz/projects/desert-storms/

Professor Peter Knippertz Project Coordinator

Prof Peter Knippertz is a Professor in Meteorology and has extensive research expertise in analysing, modelling and forecasting storm systems in the tropics and extratropics. His current work is focussed on dust storms over the Sahara and windstorms over the North Atlantic/Europe using a wide range of observations (conventional, satellite, aircraft) and modelling tools. He has been involved in several international research programmes and field campaigns such as THORPEX (The Observing System Research and Predictability Experiment), AMMA (African Monsoon Multiscale Analysis), SAMUM (Saharan Mineral Dust Experiment), and DIAMET (DIAbatic influence on Mesoscale structures in ExTratropical storms).


The project team plans to publish a series of articles about the different dust emission processes and their representation in models. The longer-term aim of the study is to influence the way that Earth system modellers represent dust emissions. “The computer code we are developing will hopefully be taken up by operational centres such as the Met Office or by people running models in research mode,” Prof. Knippertz tells us. “These pieces of software can be incorporated into their own models to further their own investigations.” To assure that interested parties knew of the work and aims of Desert Storms, Prof. Knippertz gave a number of presentations at the outset of the project at various modeling centres, universities and at conferences. The importance of dust modelling on Earth system sciences is clear not only in terms of the weather, where the bigger storms are a hazard to aviation, road transport, and military operations, but also from a climate perspective. “Dust

Diurnal evolution of near-surface wind speeds generated by a high-resolution computer model. The white line shows the mean as typically predicted by a climate model; the colours indicate the fluctuations around this mean. The abrupt increase in mean wind speed and turbulence shortly after sunrise is related to the downward mixing of the nighttime low-level jet, which has been identified as an important dust emission mechanism. (Plot courtesy of Bernd Heinold)


interacts with radiation," Prof. Knippertz explains. "It can absorb solar radiation and reflect it; it also interacts with longwave radiation. This is dependent on size distribution. Smaller dust particles interact with solar radiation, while larger particles interact more with long-wave radiation." The overall impact of these interactions can be complicated as dust can both warm and cool the atmosphere depending on the underlying surface. "A dust cloud over the dark ocean will scatter more solar radiation back into space, cooling the atmosphere," Prof. Knippertz tells us. "However, the same dust cloud over a bright desert will hardly affect solar radiation, but thermal radiation from the surface absorbed by dust particles will cause a warming effect similar to greenhouse gases." Globally,

dust is viewed as having a generally cooling effect that might offset parts of global warming. It is therefore understandable that there is a lot of interest in this study from the climate community. There are many benefits to the wider scientific community from the Desert Storms project, and knock-on effects of the project’s findings could help researchers and scientists better understand dust processes: “If we can contribute to a better representation of dust in Earth system models, I think we will help a lot of other scientists to improve their own work on the impact of dust on radiation, clouds, the biosphere, the carbon cycle, and so on.”



Modern analytical techniques allow researchers to identify changes in the main plant polymers. These changes hold real importance for the production of speciality pulp, biofuels and biomaterials, says Dr Jorge Colodette, the coordinator of the LIGNODECO project, an initiative aiming to develop new wood pre-treatment techniques

Getting the most out of Brazil’s crops The eucalyptus plant is a prominent feature of the Brazilian countryside, with a wide range of different types to be found across the country, including some developed specifically to achieve high productivity. These types are of great interest to the LIGNODECO consortium, which is looking at how the pre-treatment of both woody and non-woody Brazilian crops can be optimised. The project brings together academic and commercial partners from Brazil and across Europe in this work. There are four workpackages within LIGNODECO; selecting the raw materials, optimising pre-treatments, using advanced analytical tools for indepth characterisation of liquid and solid streams, and finally evaluating the various pre-treatments.


The project’s overall coordinator, Dr Jorge Colodette, is based at the Universidade Federal de Viçosa in Brazil. “We selected productive materials from the eucalyptus side and one type of grass, elephant grass,” he says. Researchers then pre-treated these raw materials with a view to certain applications, such as biofuels, biomaterials, and pulp and paper. “The project aims to develop analytical tools to characterise both the raw materials and the pre-treated materials. We want to achieve very good characterisation of the materials aimed at those products,” continues Dr Colodette. “A large part of the project is on this analytical side, to develop methodologies for precise measurement.”

Pre-treatment The raw materials are very well characterised chemically, but it is necessary to pre-treat them if they are to be useful for certain applications. Biofuel cannot be made directly out of the raw material for example, because it is usually too hard and most of the biochemical methods will not work. “You need to pretreat the raw material, to deconstruct it. Most biomass is made out of three polymers: you have lignin, cellulose and the so-called hemicelluloses. These are the three major polymers present; in order to use them you need to separate them from each other,” explains Dr Colodette. The polymers are roughly separated into three parts during pre-treatment, allowing the deconstruction of the original matrix; Dr



Colodette says the project is using three main types of pre-treatment. “There is the alkaline route, which includes many techniques, then the solvent route, which is basically ethanol plus a catalyst,” he explains. “The third route is enzymeassisted deconstruction, where we use mechanical treatment, after which we add some enzymes.” These techniques are designed to optimise the raw materials for various applications, including biogas, biofuels and special paper products. The alkaline treatment is very suitable for making paper, while it is also important to take the specific properties of the different types of eucalyptus into account. “Twenty two different types of eucalyptus were used in the project; one of the types is quite outstanding for biofuel for example, while others are better suited to pulp and paper. Our differentiation comes from the different pre-treatments for the different products,” says Dr Colodette. In a sense it could be possible to develop types of wood that are more suited to certain products, but this is very much a long-term aim, and the project’s focus is more on pre-treating wood and characterising the chemical changes in the lignin-carbohydrate matrix. “Certain eucalyptus woods contain large quantities of a special type of lignin – syringyl lignin – that is very good for making either paper or bioethanol. Lignin is the link that connects fibres in the biomass,” explains Dr Colodette. “This syringyl type of lignin is much easier to remove than others.”


The exact matrix of the wood can vary according to its age, but the project is focused more on using the available raw materials rather than looking at these types of issues. In Brazil trees are usually cut down at around seven years of age, an age at which researchers can pre-treat wood and modify its properties. “We aim to prove that proper raw material separation at the beginning is very important. We have proved that these various types of wood are very different from each other,” outlines Dr Colodette.

Elephant grass meanwhile grows astonishingly quickly. By the time it is 150 days old it has grown considerably and reached maturity, but for certain applications it can be harvested at 60 days old, when it is still immature. “Those two different ages of elephant grass had major differences in chemical composition. We came to the conclusion that for most applications the elephant grass is better at 150 days old,” continues Dr Colodette.


"Although from the outset we didn't really plan to study the wood age effect on biofuel and bio-products production, one of our partners did this kind of work on their own. They picked some wood at different ages and their reports suggest that the age of the wood has an effect. For example, if you want to make forestry aimed at biofuels, you should cut them in Brazil at one or two years of age, particularly in the Brazilian weather conditions." Harvesting at a young age allows trees to be planted close together, with a very high production of biomass per planted area.

Brazil is one of the world’s emerging economic powerhouses, and Dr Colodette says the project is working together with a major paper company to explore the commercial potential of their research. “They want to make outstanding paper which differentiates them from their competitors. Another major issue is energy. Paper companies are big users of energy and usually have very big cogeneration plants,” he explains. A biorefinery could enable the company to deliver much more energy to the market from their co-generation plant. “Paper companies are interested in using lignin,



At a glance Full Project Title Optimized pre-treatment of fast growing woody and nonwoody Brazilian crops by detailed characterisation of chemical changes produced in the lignincarbohydrate matrix (LIGNODECO). Project Objectives The primary objective of LIGNODECO is the development of optimised pre-treatment technologies for the deconstruction of highly productive, fast-growing Brazilian selected eucalypt clones and non-woody lignocellulosic feedstock. These materials will be used to produce special pulps, biofuels and biomaterials, using modern analytical techniques which enable in-depth identification of changes in the main plant polymers and other minor constituents.

an important polymer in the biomass. You cannot produce biofuel with it, but you can generate energy out of it,” says Dr Colodette. “Renewable energy is a major priority in Brazil. We have a very good renewable energy matrix in Brazil, around 46 per cent of our national energy output comes from renewable sources. A lot of biomass is being used for energy; the paper companies are big users of energy and they want to produce more. They are used to sell surplus energy to the national grid, but they could sell much more when operating the pulp mill as a biorefinery.”

Biopolymers This is just one of the many potential avenues that could be explored in future. The current focus however is on developing biopolymers, since producing only biofuels is not an economical approach to extract value from biomass. “Wood is a very expensive material, and with the current price of oil you can’t make a case for economical production of ethanol from wood. Not yet at least. But if we can get to a point where we can make biopolymers from it, and use the leftovers to make ethanol, then that could be profitable,” says Dr Colodette. The pre-treatment techniques have already led to two new paper products, which may be better than those that are currently on the market,


and Dr Colodette is keen to establish further patents out of the project’s research. “We have discovered that the elephant grass makes very good tissue paper – it could be used for that after these deconstruction procedures have been followed,” he outlines. “Using a solvent plus catalyst to deconstruct raw materials also results in a very good material for producing bio-ethanol. The result is a very well degraded material which is suitable for biofuel production. That work has also resulted in a patent, which was filed by the Finnish Institute VTT.” It is expected this work will help reduce Europe’s dependence on fossil raw materials in the production of chemicals and energy, which in turn will boost industry. The project’s findings will also have a significant impact on the traditional pulp and paper industry, helping it face the challenge of low-cost competitors through new cross-sector applications, which will transform their business model through the development of new, valueadded products. This is also very much in line with the EU’s wider agenda of encouraging more sustainable production processes. The stated goal is that 20 per cent of products and energy will come from renewable sources by 2020, and the work of the LIGNODECO project will help paper and pulp companies play their part.

Contact Details Project Coordinator, Jorge Luiz Colodette, Ph.D. Laboratório de Celulose e Papel Departamento de Engenharia Florestal, Universidade Federal de Viçosa, 36.570-000 - Viçosa, MG, Brazil T: +55 31 3899 2717 E: colodett@ufv.br W: www.lcp.ufv.br

Dr Jorge Colodette

Project Coordinator

Jorge Colodette earned his Ph.D in Paper Science and Engineering from the State University of New York in Syracuse, 1987. He was pulp and paper applications R&D manager for Praxair Inc., a visiting scientist at AbitibiPrice Inc. and a Visiting Professor at North Carolina State University. He is currently a Full Professor at the University of Viçosa, Brazil. He received the Tappi (Technical Association of the Pulp and Paper Industry) Fellow honorary title in 2007. Currently, he manages about 30 students and scientists.






Rebuilding the Brain Bioelectronics – the concept of a genuine synergy between manmade Electronics and natural bodily systems – was once the stuff of science fiction. But, no longer. Since the advent of the first widely used implantable technology – the artificial pacemaker – researchers have been seeking true integration between electronics and the nervous system. As the field of neural bioelectronics booms, may brain implants even one day replace drugs?

In the 1970s, the first people were fitted with a cochlear implant. It was the first piece of electronic medical equipment to link directly with neurons in the brain; in other words, it was the first, long-awaited interface between human and machine. Still in use today by over 100,000 people around the world, this 'bionic ear' stimulates hearing neurons to create a sense of sound for even profoundly deaf people. It is a true interface and therefore, there's a learning process involved. After it's implanted, the patient is taught to understand and act upon the sounds they hear with extensive rehabilitation therapy. For many, it has been beneficially life-changing and nothing short of a miracle of modern science. Hearing parents have seen their congenitally deaf child's eyes widen as it hears their voices for the first time. However, the technology hasn't been without controversy and it continues to be a bone of contention between the Deaf community and medical providers as they debate whether deafness should be perceived as an illness, or if it should be regarded as a cultural identity. But while use of cochlear implants may continue to cause rifts in the medical profession, other serious neurological conditions which cause mental and physical suffering continue to evade effective treatment. Society is facing increasingly severe population aging, leading to an increase in degenerative brain diseases: as a result, significant growth in neural implants is predicted. Could a bioelectronics device – or 'neural prosthetic' – implanted directly into the brain, be the future for treating such illnesses? Or will meddling with our little-understood brains always be controversial?

Successes with neural implants A type of neural implant referred to as deep brain stimulation (DBS) is quickly moving from clinical trials with a few patients


with Parkinson’s disease, dystonia and clinical depression to much wider adoption of the technology. Acting as a kind of ‘neural pacemaker’, these systems send electrical impulses to certain regions of the brain to control the unwanted physical and mental symptoms of such illnesses. Approved by the FDA to treat Parkinson’s since 2002, DBS is one surgical option for patients for whom drugs have not alleviated the symptoms of tremor and movement difficulties. Electrodes are sited in the thalamus, the subthalamic nucleus or the globus pallidus, and a pulse generator is also implanted in the chest. Although this approach does not treat the illness directly, the majority of symptoms are reduced, if not almost completely eliminated whilst the system is on. The risks of major surgery have so far limited the spread of DBS. The other risks of neural implants are still being investigated, but do these outweigh the potential risks and side effects of drugs? The now privately-held BrainGate system is another success of recent years, albeit one which remains in the experimental stages. This implantable system is designed for those who have lost control of their limbs, including those who have suffered spinal cord injury. In its current form, a small sensor sensitive to the firing of neurons is implanted in the region of the brain area which controls the lost movement. The sensor – laden with 96 electrodes – translates the excitation of neurons to an external device. Using this technology, patients have been able to effectively control a computer or a robotic arm simply by thinking about it. Trials are on-going and a paper published in Nature earlier this year outlined the ability of patients with tetraplegia to control robotic arms with the BrainGate. One female patient who had suffered a stroke in her brain stem was able to bring a cup of coffee to her lips – the most sophisticated movement yet achieved



by a brain-computer interaction. Such genuine neural interfaces really are the stuff of science fiction. Another burgeoning area of bioelectronics is the 'bionic eye'. Many distinct research groups are battling it out to create the first truly effective way of restoring sight. Patients in trials are often those diagnosed with retinitis pigmentosa – a degenerative disease of the retina that eventually causes complete loss of sight. As it stands, a handful of patients have had surgical procedures to insert the interface. The Retina Implant AG product, for example, has been trialled in Germany and was this year implanted in the first two UK patients as part of on-going clinical trials. A small sensor fitted with 1500 electrodes, inserted beneath the retina, converts light into electrical impulses, which it feeds to the optic nerve. So far, the best patients can expect is to be able to distinguish the outline of shapes and spot the differences between black and white. But even this is beginning to offer a return to 'useable vision' – an opportunity to regain control over their lives – and the scientists involved believe that as the brain learns to use the technology more optimally, vision with the interface will continue to improve for those using it. Exciting times are up ahead. If a method of restoring the sight of people with retinitis pigmentosa can be identified, there is hope for those with other forms of blindness.

External sound processor of a cochlear implant
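Systems of this kind ultimately rest on a decoding step: the firing recorded on many electrodes has to be translated into a command such as a cursor or arm velocity. The sketch below shows a toy linear (ridge-regression) decoder trained on synthetic data; it is purely illustrative – the channel count is borrowed from the 96-electrode sensor mentioned above, but the data, names and method are assumptions, and real systems such as BrainGate typically use more sophisticated decoders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: firing rates from 96 electrodes recorded while a known
# 2-D cursor velocity was observed (both entirely simulated here).
n_samples, n_channels = 2000, 96
true_mapping = rng.normal(size=(n_channels, 2))            # hidden "tuning" of each channel
velocity = rng.normal(size=(n_samples, 2))                 # known cursor velocities
rates = velocity @ true_mapping.T + 0.5 * rng.normal(size=(n_samples, n_channels))

# Ridge regression: W = (X^T X + alpha*I)^-1 X^T Y maps firing rates to velocity.
alpha = 1.0
X, Y = rates, velocity
W = np.linalg.solve(X.T @ X + alpha * np.eye(n_channels), X.T @ Y)

# Decode a new burst of activity into a 2-D movement command.
new_rates = rng.normal(size=(1, n_channels))
decoded_velocity = new_rates @ W
print("decoded velocity command:", decoded_velocity.ravel())
```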

New materials: into the nanosphere One issue with neural implants and electrodes is that the brain, over time, encapsulates the device with scar tissue. As the brain tissues move slightly, they rub against the implant and this worsens the build-up of scarring. As a result, proteins adsorb onto the surface and gradually, connectivity between the electrodes and the neurons is lost. Neural bioelectronics also need to be smaller, thinner, more like our own tissues in order to be more

effective and more discreet for their users. The first generation of bioelectronics saw the use of metal micro-wires, followed by the development of silicon arrays. With the advent of nanotechnology, scientists are investigating materials that may hold the key to making this happen.


Most implants are currently manufactured from platinum or iridium oxide but it’s hoped that third generation devices will use materials that are bioactive and flexible, with integrated electronics and surfaces which create a controlled biological response. In terms of engineering bioactive surfaces for neural bioelectronics, the overall aim is to reduce protein adsorption and increase the ‘right’ kind of cellular interactions with neurons and the device. Materials under investigation include synthetic plastics such as polyethylene glycol and polyvinyl alcohol or more natural substances such as polysaccharides and phospholipids. The idea is that accurately mimicking the extracellular matrix will prevent proteins attaching to the device’s surface. The physical and chemical properties of singlewalled carbon nanotubes (SWCNT) have made them a great candidate for improving the electrical responsiveness of neural devices. Research has already shown that SWCNT can themselves stimulate neuronal activity and that neuronal networks grow well on SWCNT mesh. Researchers are also exploring the options for surfacing implants with nanomaterials to prevent their rejection by the body as a foreign object. Nanoscale electro-conductive polymers are being tested as coatings on implants such as neural electrodes, in the hope the new surface will reduce rejection and improve the implant’s integration with the body. At just one atom thick, electrically conductive graphene is one avenue of research. It’s highly flexible, which should help reduce the formation of scar tissue, although conversely, this flexibility makes it difficult to insert in the correct orientation during surgery. The beneficial properties of graphene, however, mean that researchers are developing non-toxic, biodegradable ‘backbones’ that hold it stiff during insertion but which rapidly break down inside the body. If loaded with anti-inflammatory drugs, this could also aid the acceptance of the implant. This nanomaterial is also resistant to harsh bodily environments and unlike silicon, has no need to be stabilised by a metal oxide coating. This in turn makes graphene more sensitive to cellular activity, which if used in neural implants, could improve the responsiveness of the bioelectronics product. But while the efficient manufacture of graphene sheets remains out of reach, ultra-thin silicon may continue to make headway. At the moment, the insertion of neural bioelectronics is a ‘hit and miss’ game. For some patients suffering from depression, for example, deep brain stimulation reduces symptoms while for others, it has no effect. But the potential is clearly there, and with more development, the ultimate goal would be to harness neural plasticity to our advantage and to use bioelectronics devices to actively characterise, predict and control the neural interface. Modelling and characterizing the cellular and chemical responses around neural implants will be vital in selecting which materials to move forwards with.

Powering the future Power is also an issue for implantable devices. How could such devices be powered in the long-term without the need for further surgery? In many devices currently in trials, power for the implant comes from a built-in battery or is received wirelessly from an external energy pack. Such devices must currently operate at low



energy levels because the surrounding tissue is often very sensitive to changes in temperature. Devices with on-board batteries may require surgery to replace worn-out cells, but repeated invasive procedures are, as always, best avoided. Next generation neural implants may be created with integrated power sources that provide a continuous, self-sustaining energy source without increasing the device’s size. One concept is to produce energy using biologically-inspired sources, such as using microbial fuel cells, synthetic photosynthesizing cells, or perhaps by capturing kinetic energy from the natural movements of the body. The challenge will be to do this, and improve battery energy density without increasing tissue damage. One research group at MIT has created a fuel cell powered by the glucose in the body. The cell will effectively gather and break down glucose to create energy in much the same way as the body’s cells do. While similar systems have been created before, this fuel cell uses a solid-state platinum catalyst to produce power and importantly, doesn’t spark an immune response when inserted inside the body. Such a device could be put to great use in neural bioelectronics: The cerebrospinal fluid which surrounds the brain and fills its cavities is rich in glucose to fuel the powerhungry neurons.

A drug-free future? So, bioelectronic implants are already showing potential for treating neural degenerative diseases such as Parkinson’s, as well as restoring some degree of motor function and vision. But might it be that not only such diseases, but other, systemic illnesses, be managed with implants, too? The focus on Western medicine remains drug-based treatment, but could the refinement and optimization of neural bioelectronics signal a paradigm shift? Cells functioning abnormally could be brought ‘back into line’ not with chemicals, but with intelligent electrical signals. Neuronal disturbances and illnesses, from loss of mobility to epilepsy and even clinical depression, could potentially be tackled in this way. It’s not just an idea – serious money is behind making it a reality: a report published in the Financial Times this year revealed that pharmaceutical giant GlaxoSmithKline is investing more resources than ever in bioelectronics. So perhaps the question is; will society find neural implants any more sinister than the prescription of drugs with potential side effects? Would treating the symptoms of disease with electrical signals be any more disturbing than submitting to the swallowing of a powerful pill each day? It seems that whatever we decide, efficient, effective neural bioelectronic devices are on their way.

Implantable retinal stimulator manufactured by IMI Intelligent Medical Implants Credit: Ars Electronica




Taking novel materials from concept to application The extraordinary electronic properties of graphene have generated great interest in both the academic and commercial sectors, with scientists predicting the material could be used in a wide range of technological products. However, many challenges remain before graphene can be widely applied, says Professor Oleg Yazyev of the EPFL in Lausanne, Switzerland

The development of graphene, a two-dimensional material formed by a single layer of carbon atoms, has generated great interest in both the academic and commercial sectors, with scientists predicting that it could in future be applied in everything from consumer electronics to solar panels. However, many outstanding problems must be solved before these predictions can be fulfilled, an area which forms the primary focus of Professor Oleg Yazyev's research. "The global objective of my research is to solve problems on the way from the basic physical properties of novel electronic materials to their technological applications," he says. Based at the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland, Professor Yazyev's research group is studying the novel physical properties of graphene by means of theory and computations. "Charge-carrier mobilities in graphene are exceptionally high, which makes it very interesting for making high-speed electronic devices," he explains. "However, graphene is very different from the materials which are currently used in electronics. It presents us with a large number of new problems which we still have to solve before we see the first graphene computer."

Understanding wafer-scale graphene Among the most important problems is the difficulty of controlling charge transport. Today graphene can be produced at both wafer scales and beyond, but scaled-up samples do have a different structure to micrometre flakes, which are often singlecrystalline in structure. “At larger scales graphene is unavoidably polycrystalline, meaning it’s composed of single-crystalline


domains with different relative orientations of the crystalline lattice,” explains Professor Yazyev. Topological defects like grain boundaries, line defects which separate two-dimensional single-crystalline domains, are intrinsic to polycrystalline materials and have a significant impact on the electronic properties of graphene. “Studying polycrystalline graphene and relevant topological defects is an important project in my group. I’m particularly interested in understanding how these defects affect the electronic transport properties of graphene,” says Professor Yazyev. “We are also trying to propose new concepts for using these defects to control electronic transport in graphene, as functional components of nanoelectronic devices. This research is directly relevant to the potential technological applications of graphene, especially in electronics.” During transport charge carriers pass through a grain boundary with a certain probability. This transmission probability varies according to several factors. “It would depend on the energy of the charge carrier, on the direction it travels in respect to the grain boundary, and of course on the characteristics of the grain boundary itself. These include the relative orientation of the two single-crystalline domains and the local atomic structure of the grain boundary. There might also be some other interesting phenomena; for example, we have predicted theoretically that grain boundaries of certain structural types completely reflect charge carriers in certain energy ranges,” says Professor Yazyev. Such defects could in principle be used as device components, utilising properties not present in single crystalline graphene;

Professor Yazyev says this is a technically challenging area. “If we want to use these defects for a purpose, as device components, we need to learn how to engineer them,” he explains. “This is a very challenging task – currently there is no method which really demonstrates a sufficient degree of control over the structure of these defects. Developing such a method is one of the goals of my research.” This hints at the wider commercial potential of graphene, and also the scope for further research into the material and its possible applications. With such a wide range of research possibilities, Professor Yazyev doesn’t want to limit himself to looking at just polycrystalline graphene. “There are some other new ideas which may find use in technology, such as manufacturing onedimensional graphene nanostructures,” he says. Professor Yazyev has clear plans for the future. “From understanding electronic transport in polycrystalline graphene I would like to move on to the transport properties of self-assembled graphene nanostructures produced in a bottom-up chemical approach, which could be a highly effective method for making functional nanodevices at large scales,” he outlines. While his research is mainly theoretical in nature, Professor Yazyev maintains close relationships with experimental colleagues and is keen to work in areas that could be relevant to the potential commercial applications of graphene. “We are actively



using the results obtained by our experimental colleagues. We collaborate with them in helping to understand their results, and, most importantly, we are trying to motivate them to realize our theoretical predictions in practice,” he outlines. “We can accurately predict how various types of structural modifications change the properties of graphene. This includes specific electronic, magnetic,

transport, mechanical and optical properties. And by combining our efforts with experimentalists we aim to introduce graphene into real technological applications sooner."

Beyond traditional electronics Alongside its focus on applying novel materials in traditional electronics, the group is also working on alternative technologies. One such area is spintronics, an extension of electronics which makes use of electron spins to store and operate information. "The idea is to combine the extraordinary electronic properties of graphene, which is what makes it exciting in terms of pure electronics, with magnetic ordering to produce an ultimate platform for spintronic devices," explains Professor Yazyev. "We would like to have magnetic ordering at least at room temperature, the regime in

which practical spintronic devices are supposed to operate.” The group is also pursuing research into other Dirac fermion materials; prominent among these are topological insulators, which display Dirac fermion surface states similar in many ways to those in graphene. However, these materials present a number of new physical properties, such as the absence of spin degeneracy and the helical spin texture of the Dirac fermion surface states. “Topological insulators realise the intrinsic electronic topological order. This is a whole new class of materials found only very recently. I would compare the discovery of topological insulators to the discovery of superconductivity. This is basically of the same calibre,” says Professor Yazyev. “I’m actively extending the scope of my research interests into the field of topological insulators as the field grows, in particular I would like to focus on the inter-disciplinary aspects. Most of the research in this area is currently being done by the physics community, but there are also many exciting problems related to materials science, chemistry and electrical engineering.”
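The grain-boundary transport picture described earlier – charge carriers striking a boundary and being transmitted with a probability that depends on their energy, their direction of travel and the boundary's atomic structure – is commonly summarised with a Landauer-type expression. The form below is the standard textbook one, included here only for orientation; it is not taken from the project's own publications.

```latex
% Landauer picture of conduction across a grain boundary: each transverse
% channel k_parallel transmits with probability T(E_F, k_parallel).
G \;=\; \frac{2e^{2}}{h}\sum_{k_{\parallel}} T\!\left(E_{F},k_{\parallel}\right)
\qquad\longrightarrow\qquad
\frac{G}{W} \;=\; \frac{2e^{2}}{h}\int\frac{\mathrm{d}k_{\parallel}}{2\pi}\,
T\!\left(E_{F},k_{\parallel}\right)
\quad\text{(wide sample of width } W\text{)}
```

Here E_F is the Fermi energy and 2e²/h is the conductance quantum including spin; everything specific to the boundary – its misorientation angle and local atomic structure – is contained in the transmission function T. A boundary whose transmission vanishes for all channels in some energy window behaves as the fully reflecting defect mentioned above, which is what makes such defects interesting as potential device components.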

At a glance Full Project Title Dirac fermion materials: from fundamental science to applications through computation Project Objectives The projects aims at understanding and designing novel materials, functional nanostructures and devices in silico, that is, by means of computer simulations involving various level of computational complexity. The ultimate goal is to build a continuous bridge between fundamental science and technological applications based on numerical experiments. Contact Details Project Coordinator, Professor Oleg Yazyev Ecole Polytechnique Fédérale de Lausanne (EPFL) CH-1015 Lausanne Switzerland T: (+41 21 69) 35485, 33415 E: oleg.yazyev@epfl.ch W: http://www.epfl.ch W: http://gr-yaz.epfl.ch This project is supported by the Professeur Boursier grant of the Swiss National Science Foundation (Grant No. PP00P2_133552)

Professor Oleg Yazyev

Project Coordinator

Professor Oleg Yazyev gained his degree in chemistry from Moscow State University in 2003 then joined Ecole Polytechnique Fédérale de Lausanne (EPFL), completing his PhD thesis in chemistry and chemical engineering in 2007. His current research focuses on the theoretical and computational physics of the recently discovered Dirac fermion materials with a strong emphasis on their prospective technological applications.

Artistic image of the atomic structure of a grain boundary in graphene




Dynamic light scattering infrastructure

Nanoparticles are an important part of many consumer products, yet little is known about their effects on the environment. We spoke to Andrew Nelson, Dave Arnold and Karen Steenson of the ENNSATOX project about their research into the structure and function of nanoparticles, and how this could lead to regulation of the nanoparticle industry Engineered nanoparticles are used in a growing list of modern consumer products, including cosmetics, pharmaceuticals and various types of sensors, yet the nature and extent of the risk they pose to the environment is still not fully understood. While keen to encourage the development of the nanoparticle industry, regulatory bodies must also be mindful of its likely impact on the environment, as using new products without understanding their likely impact on environmental systems can have serious consequences. The use of DDT to boost crop production is just one example of a new product which was widely used for a time then later found to be harmful to the environment, and the European Commission is keen to ensure this kind of mistake isn't repeated. Developing reliable tests by which the

toxicity of nanoparticles can be assessed is a crucial first step. We spoke to Andrew Nelson, Dave Arnold and Karen Steenson of the EC-funded ENNSATOX project about their work in investigating the structure and function of nanoparticles and its importance in terms of regulating the industry.

Structure and function of nanoparticles EU Researcher: What are the main areas being studied in ENNSATOX? Professor Andrew Nelson: The hypothesis of ENNSATOX is that the biological activity and environmental impact of nanoparticles is directly related to their structure and functionality. By evaluating these relationships we can then develop predictive models which

can be deployed for statutory controls of nanoparticle use.

EUR: Are you focusing on any particular types of nanoparticles? AN: We’re concentrating on three types of nanoparticles; silicon dioxide (SiO2), zinc oxide (Zn0) and titanium oxide (TiO2). We’ve found that silicon dioxide, which is similar to ground-up glass as it’s amorphous, is fairly stable. So it remains a small particle throughout its lifecycle and is very membrane active. With zinc oxide we also found a particle size effect – for very fine particles they’re highly active. The problem with zinc oxide is that it also dissolves to form zinc iron – so you have to distinguish between the activity of the particles and the activity of the soluble zinc oxide. We’ve found that the zinc oxide particles themselves are highly membrane active.



EUR: What impact does this have on the toxicity of the particle?


AN: Toxicity has many elements. One element is the interaction with the biological molecules; in ENNSATOX we’re looking at interactions with lipids, proteins and DNA. Another is the overall effect on an organism, and we’ve found that the silica particles have a very high membrane activity, which is unusual.

EUR: How do you approach characterising a nanoparticle? AN: Particles have a whole suite of characterisation. We use a multitude of microscope and imaging techniques and light scattering to measure particle size and chemistry, and look at how the particles behave in solution or how they disperse. We also measure aspects of behaviour – whether the particles aggregate quickly, whether they are stable, and so on. Dave Arnold: The ENNSATOX consortium is strictly controlled in terms of what researchers use, so everybody uses a highly characterised sample. Everybody is absolutely certain that their samples have the same configuration; it’s a highly rigorous process.
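One of the sizing techniques mentioned here, dynamic light scattering, works by measuring how quickly particles diffuse in suspension and converting that diffusion coefficient into a hydrodynamic diameter via the Stokes–Einstein relation. A minimal sketch of that conversion is shown below; the temperature, viscosity and example diffusion coefficient are assumed values for illustration (water at 25 °C), not ENNSATOX measurements.

```python
import math

K_B = 1.380649e-23        # Boltzmann constant, J/K
T = 298.15                # temperature, K (25 degrees C, assumed)
ETA = 0.89e-3             # dynamic viscosity of water at 25 C, Pa*s (assumed)

def hydrodynamic_diameter(diffusion_coefficient):
    """Stokes-Einstein relation: d = k_B * T / (3 * pi * eta * D)."""
    return K_B * T / (3.0 * math.pi * ETA * diffusion_coefficient)

# Example: a measured diffusion coefficient of 4.9e-12 m^2/s (illustrative)
# corresponds to a particle of roughly 100 nm.
d = hydrodynamic_diameter(4.9e-12)
print(f"hydrodynamic diameter ~ {d * 1e9:.0f} nm")
```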

EUR: Is there a relationship between the structure and function of a nanoparticle and the results of the characterisation? AN: Our most important finding to date is related to particle size. We've found that when the particle size is large it has little effect, but when it's small there is a much higher effect, meaning greater toxicity. DA: When people talk about nanoparticles they can be talking about a wide range of sizes. AN: A nanoparticle is between a tenth of a nanometre and 100 nanometres, so the range runs from around the size of an atom up to about a tenth of the size of a living cell!

EUR: Are the properties of nanoparticles partly due to the fact they are so small? AN: These particles have unusual electronic properties and that gives them other characteristics. When they get to a very small size they have very interesting and unusual properties. For instance zinc oxide is actually a semi-conductor, but only at very small sizes. Zinc oxide showed very interesting electro-chemical behaviour; this was


separate from the lipid-zinc oxide interaction, but it could be a factor in it. The zinc oxide particles behaved as semiconductors – you could actually see electrons jumping onto these materials.

A test species (Tisbe)

EUR: Are you worried that these effects aren’t fully understood by the people using these materials in consumer products? AN: Absolutely. DA: The original work showed that the size of the particle related to penetration through cellular membranes. With electron microscopy and time-lapse photography you can see how these particles get into cellular membranes – it’s quite dramatic.

EUR: You work with biological models to try and understand these processes? AN: Yes. We start off with membrane models, which go up in complexity from a simple layer of phospholipids on an electrode to several layers of phospholipids, then bi-layer membranes on electrodes and vesicles on electrodes. We look at the effects of the particles on the structure and composition of these membrane models.

EUR: How do you assess whether they have toxic effects? AN: Toxicity means altering the performance of an organism from its normal state. Because it's very inconvenient to carry out experiments with living organisms, we have started to do in vitro experiments, where we look at the interaction of these particles with different parts of an organism. This could be cells, DNA, genetic material, or biological membranes, which have always been very important in toxicology. A biological membrane is the first interface between a cell and its interior. When you're looking at toxicity of a nanoparticle to cells you ask: does it kill the cell? Does it affect its growth? DA: Toxicity increases with the concentration of the nanoparticle, probably to an extent where the cell can't recover. You've also got different kinds of responses from the organism – acute responses and chronic responses, which might ultimately affect the organism's ability to reproduce.
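Concentration–response data of the kind described here are routinely summarised by fitting a sigmoid (log-logistic) curve and reading off an EC50, the concentration that produces a half-maximal effect. The snippet below fits such a curve to made-up numbers; the data, parameter names and starting guesses are illustrative only and are not ENNSATOX results.

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(conc, bottom, top, ec50, hill):
    """Four-parameter log-logistic concentration-response curve."""
    return bottom + (top - bottom) / (1.0 + (conc / ec50) ** hill)

# Made-up exposure concentrations (mg/L) and measured responses (e.g. % viability).
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
resp = np.array([98.0, 97.0, 93.0, 80.0, 55.0, 28.0, 12.0, 6.0])

# Fit the curve; p0 gives rough starting guesses for the four parameters.
params, _ = curve_fit(log_logistic, conc, resp, p0=[5.0, 100.0, 1.0, 1.0])
bottom, top, ec50, hill = params
print(f"estimated EC50 ~ {ec50:.2f} mg/L (Hill slope {hill:.2f})")
```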

EUR: Can you identify both temporary and permanent toxic effects? DA: We can identify reversible effects, yes. One way of showing that is by taking an organism out of exposure to a toxin – into fresh water, for example, for aquatic organisms – and seeing whether it recovers from exposure to a given dose and for how long. In classical toxicology you can do bioaccumulation studies by feeding tolerable doses of chemicals to fish and then looking at their excretion rates and seeing whether they excrete it or retain it. As humans we are able to tolerate quite low but constant concentrations of many persistent substances in our bodies.

EUR: Do you plan to develop a database of this kind of data? DA: We do have a database of toxicity data being generated within the ENNSATOX project but we are also developing a database on ‘expertise’. We’re creating a database of experts across various specialist areas. This is not just for ENNSATOX but for the cluster, the overarching group of people doing other projects in similar areas.

Ecological studies EUR: What is the role of ENNSATOX within the cluster? Dr Karen Steenson: The important thing about this project is that it’s on the ecological side of things. Much of the testing of nanomaterials and nanoparticles has come out of human studies. The ecological side of things has been relatively neglected by comparison.

EUR: This is an imbalance you’re working to redress? KS: Yes, we’re looking at testing systems of increasing complexity, from cellular assays using mammalian, freshwater and marine cells; tissue level assays using squid giant axons and synapses;



At a glance Full Project Title Engineered Nanoparticle Impact on Aquatic Environments: Structure, Activity and Toxicology Project Objectives • To source and comprehensively characterise a representative group of nanoparticles: initially ZnO and later SiO2 and TiO2 and other metal oxides of varying morphology and dimension. • To characterise the interaction of the nanoparticles with the following biological models: supported phospholipid membranes of increasing complexity, in vitro models of cell and tissue culture, in vivo models of several different species of key indicator organisms. • To formulate direct and predictive structureactivity relationships between nanoparticle form and nanoparticle biological activity. • To analyse the behaviour and fate of nanoparticles and their impact on models of biota in environmental aquatic systems. • To configure a mathematical model for the behaviour of nanoparticles in aquatic environments taking account of their interactions with biota of increasing complexity. • To draw up standard procedures for the exploitation and dissemination of the results for statutory planning and accredited use. Contact Details Scientific Coordinator, Professor Andrew Nelson Centre for Molecular Nanoscience (CMNS), School of Chemistry, University of Leeds, UK T: +44 113 343 6409 E: andrewn@chem.leeds.ac.uk W: www.ennsatox.eu Project Manager Dr Karen Steenson, Faculty of Engineering, University of Leeds, UK

Professor Andrew Nelson

Project Coordinator

developmental assays using ascidians; through to unicellular organisms such as fresh- and sea-water algae; right through to complex multicellular organisms such as invertebrates (e.g. Daphnia) and fish (e.g. Zebra fish). We’ve also looked at species sensitivity and exposing organisms such as fish acutely (single dose) and chronically (giving them repeat doses) AN: When you look at model membranes you can get very exact, precise answers. However, when you go on to a singlecelled organism like algae, it’s not quite so precise. As organisms grow more complex it becomes harder to get precise values.

The question is: are the current toxicity tests robust or sensitive enough to tell you about the toxicity of nanoparticles?

EUR: Have you established any links with Government or environmental bodies? DA: We held a meeting in London where we brought industry, researchers and Government together. We had a very enthusiastic response from DEFRA, who I think see this project as crucial to giving them a handle on eco-toxicity and pointers as to how best to regulate the nano industry. The OECD is developing test methods as nano-regulatory tools, which are mostly based upon standard toxicity tests. Nanomaterials – including nanoparticles – will in future be regulated by the European Chemicals Agency (ECHA).

EUR: What do you hope will be the outcomes of your project? KS: The European Commission has invested a lot of money in this area, and the projects they've sponsored have come together in the form of the NanoSafety Cluster (http://www.nanosafetycluster.eu/), and ENNSATOX is one of the participating projects. The Cluster is a Commission DG RTD NMP initiative to get FP6 and FP7 projects to work together to maximise their synergies in order to address all aspects of the safety of nanomaterials, including toxicology, ecotoxicology, exposure assessment, mechanisms of interaction, risk assessment and standardisation. I would envisage the outcome of the work being done in ENNSATOX and other nanosafety projects will be some form of tier-testing methodology, so you'd go from simple systems – your cell membrane and cellular organism-based systems – all the way through to more complex organisms. When a company develops a material it would go through this testing regime, and you could predict its toxic effects. DA: DEFRA don't want to hamper business in the UK by over-regulating, but it's also important to be certain that nanoparticle use is environmentally acceptable.

Dynamic light scattering infrastructure

Andrew Nelson obtained his PhD in Estuarine Chemistry in 1975 and carried out a postdoc at the University of Edinburgh in 1975-76, examining the physics and chemistry of particles in the ocean. From 1976-81 he worked as an Analytical Chemist with Thames Water Authority. From 1981-2001 he was based at Plymouth Marine Laboratory, where he pioneered the area of biosensors, in particular membrane-based sensors. Since 2001 he has worked as an academic at the University of Leeds. In 2009 he won the prestigious Royal Society Brian Mercer Award for his longstanding pioneering work on environmental sensing.


Mexico’s Center for Sustainable Development: Promoting Green Growth in Latin America The issue of global climate change is one that affects us all; worldwide, nations are coming up with new ways to combat the effects of global warming. EU Research’s Richard Davey investigates the Centre for Sustainable Development, Mexico’s answer to the question of global climate change

As the consequences of global climate change upon Earth become ever more apparent, methods of combating these effects are becoming increasingly imperative, and one country in particular is leading the way towards a greener future. Mexico, led by its president Felipe Calderón, has taken these global concerns on board, and this has led to the creation of the Desarrollo Sustentable, Asociación Civil or, by its English translation, the Centre for Sustainable Development. On 23rd March 2012, Mexico’s Centre for Sustainable Development was officially launched. Financed by the Mexican government and also sponsored by the United Nations Environment Programme (UNEP) and the Climate Works Foundation, the Centre is a public-private non-profit organisation created with the intention of

actively promoting a green economy for the country as well as developing policy and supporting activities that tackle the issue of global climate change. The Centre is designed to be a Think Tank that will attract specialists and the finest minds on environmental issues not only from Mexico, but also from around the world. It aims to become a hub of South-South co-operation, promoting the open exchange of resources, technology, and knowledge between the developing countries of the global South. It is hoped that the Centre will become a forum that encourages government agencies, private enterprises, and civil society to engage in open and constructive dialogue. The Centre will provide them with a platform for their own thoughts and ideas in relation to the fundamental issues that lie at the heart of the institution. As an independent and transparent

institution, the Centre will serve as a model not only for other regions in the area, but also for other countries around the world. The Centre for Sustainable Development boasts an international Board of Directors whose combined knowledge and experience in the field of environmental issues take Mexico to the forefront in the fight against climate change. Dr. Mario Molina is perhaps one of Mexico’s most famous environmental scientists; he serves on the US President’s Council of Advisors on Science and Technology and won the Nobel Prize in Chemistry in 1995. It was Dr. Molina’s pioneering work in the 1970s that first alerted the world to the threat that CFC gases posed to the Earth’s ozone layer. Another board member is Dr. Rajendra Pachauri, who has also had a long and distinguished career focussed on global climate and environmental issues. He was made chairman of the Intergovernmental Panel on Climate Change (IPCC) in 2002 and is the Head of Yale’s Climate and Energy Institute. In 2007, as its chairman, he accepted the Nobel Peace Prize awarded to the IPCC for its work on climate change.

poverty gap within the country. During the Centre’s inauguration ceremony earlier this year, President Calderón commented that the COP16 was the source of some of the greatest achievements in international negotiations on environmental issues. Mexico had pledged to promote a Regional Centre of Research on Sustainable Development and Climate Change, and it was this pledge that became the Centre for Sustainable Development.

The Centre is designed to be a Think Tank that will attract specialists and the finest minds on environmental issues not only from Mexico, but also from around the world

The board also includes Mr. Achim Steiner, UN Under-Secretary-General and Executive Director of the United Nations Environment Programme; Mr. Martin Lidegaard, Minister of Energy and Climate Change for the Government of Denmark; and Mr. Joseph Ryan, Latin America Vice-President of the Climate Works Foundation.

Origins of the Centre for Sustainable Development

Initially announced in December 2010 at the 16th Conference of the Parties (COP16) at the Cancun Climate Change Conference, the Centre for Sustainable Development’s mission is to promote viable, innovative public policies that will address economic development and sustainability while at the same time working to redress the

In 2010, Mexico was ranked as the 13th largest economy in the world, with a GDP of approximately US$1 trillion; however, around 40% of Mexico’s population still lives in poverty. The genesis of the Centre lies in the desire to help reduce poverty and stimulate economic growth in the region, but these goals will be much harder to achieve without first reducing greenhouse gas emissions and seeking more renewable sources of energy. Furthermore, Mexico is one of the 18 megadiverse countries of the world; it is home to some 200,000 different species and accounts for between 10-12% of the world’s biodiversity. The Mexican government’s responsibility to protect the country’s biodiversity from the threat of global climate change adds to the importance of the Centre for Sustainable Development and the work that it will undertake.

The work carried out at the Centre will focus on researching best-practice methods for social development, green growth, and mitigation and adaptation action, and on financial models that will allow private finance to be unlocked for the promotion of green growth. Over the next three years there are six core studies that the Centre will be working on in order to achieve its goals. These studies are:

• Mexico’s Green Growth Plan • Natural Resources Productivity: Mexico’s Case • Low Emissions Transport in Mexico and Latin America • Methodology and Guidelines for Climate Change Action and Low-Carbon Growth Programs at National and State Levels • Technology and Green Growth: Matching Regional Technology Demand with International Technology Supply • Water: Challenges and Opportunities On top of this, the Centre is working with the World Economic Forum (WEF) and the Green Growth Action Alliance (G2A2) to develop a financing framework to encourage investment from the private sector into Mexico’s green growth infrastructure. So far, around 50 companies and organisations are involved in this project; in October 2012 the Centre will host a workshop in Mexico City for this alliance. While a lot of the work carried out by the Centre for Sustainable Development will focus on encouraging policy changes at a national level that will foster a greener economy, it also undertakes applied research. This research is aimed at indentifying opportunities within various business sectors that would ultimately benefit Mexico but would also serve as a means of highlighting possible barriers that would prevent the development of such sectors. Working again with the WEF and G2A2, the Centre is preparing to launch a series of pilot schemes that will address a range of issues integral to the underlying ethos of the Centre. The pilot schemes will be focussing on renewable energy, energy efficiency, public transport, and waste management. One of the schemes already underway within Mexico is Trade in Your Old One for a New One. This program encourages the public to trade in their old, energy consuming, household appliances for

newer, more energy-efficient models. When talking about this programme, President Calderón commented that Grandma might love her old fridge, which has lasted her for years, but it uses up much more energy than today’s modern refrigerators do. Older appliances do more harm than good in the long run, and a government-approved scheme that allows access to newer, more energy-efficient appliances for all is certainly a step in the right direction in terms of creating not only a greener economy but also a greener planet.

The Centre for Sustainable Development in Context

At the G20 Leaders’ Summit in June 2012, President Calderón said that at the start of his government he had pledged to promote a responsible, active foreign policy that would trigger national development. He went on to say: “Mexico assumed a clear leadership of one of the most pressing, important challenges for humanity, namely the challenge of climate change. [COP16 brought] together leaders and specialists from all over the world, where major commitments were reached to halt environmental deterioration for the benefit of present and future generations.” Mexico’s commitment towards establishing itself as a global leader when it comes to the sustainable development of the country, and also the promotion of a Green Economy, is all the more apparent with the establishment of the Centre for Sustainable Development, but this is just one facet of the overall picture. As reported by UNEP in their March-April 2012 newsletter, Mexico’s Senate recently approved the General Law on Climate Change, which sets out guidelines on how to reduce greenhouse gas emissions and also includes public policies towards the sustainable development of the country. The law will encourage climate change

“Mexico assumed a clear leadership of one of the most pressing, important challenges for humanity, namely the challenge of climate change.” – President Calderón


“The centre is a viable alternative to build and work in Green Growth economic models; we believe it embodies an opportunity for more efficient use of vast natural resources in the region while at the same time reducing the vulnerability to climate change.” – Irma Gomez, Director of the Centre
research and support the efforts made by public institutions to curb it. UNEP also reported that the National Chamber of Transformation Industries, which represents close to 50,000 entrepreneurs, has officially launched a Green Economy Sector. Furthermore, the Mexican government has also adopted a Special Climate Change Programme, providing ambitious short-term targets aimed at reducing emissions of greenhouse gases.

In Conclusion

During the centre’s inauguration ceremony, President Calderón commented that humanity faced a twofold challenge: bridging the gap between man and nature, and between rich and poor. The centre addresses this challenge. By working in conjunction with the private sector, non-governmental organizations, academia, and government agencies, the centre will set a Green Growth Plan for

Mexico that will not only promote a greener economy, but will also help create more jobs throughout the country by unlocking private financing to increase green investment. Irma Gomez, the centre’s Director, told EU Research: “The centre is a viable alternative to build and work in Green Growth economic models; we believe it embodies an opportunity for more efficient use of vast natural resources in the region while at the same time reducing the vulnerability to climate change.” Global climate change is perhaps the greatest challenge faced by contemporary society; worldwide, countries are finding ways to address this challenge and reduce the negative impact that we have had on the Earth’s ecosystem. Mexico’s Centre for Sustainable Development highlights the importance of promoting Green Growth and should serve as an example to other nations of the steps that we should all be taking in order to secure our children’s futures.


Bridging the maths-physics divide

FIGURE 1. Feynman diagrams depicting some possible interactions of light with matter. The wavy lines represent photons; the straight lines, electrons or positrons.

FIGURE 2. These schematic diagrams represent idealised interactions between particles in Quantum Field Theory. To each one is associated a kind of probability or amplitude which involves zeta or multiple zeta values.

Physics and mathematics have grown increasingly specialised over recent years, making it much more difficult for researchers to share knowledge and expertise, despite the fact that their work is often closely related. Exchanging ideas and techniques will benefit researchers in both disciplines, says Dr Francis Brown of the PAGAP project

Once considered part of the same area of study, mathematics and physics have diverged considerably over the last few hundred years. The two fields have become so specialised that it is increasingly difficult to share ideas and techniques, an issue the PAGAP project is working to address. “We aim to develop fundamental mathematical ideas which are inspired by questions in particle physics,” says Dr Francis Brown, the project’s coordinator. “We are also working to develop practical tools that are useful for physicists, by drawing on recent developments in mathematics.”

Periods in mathematics

The study of periods, which form a certain class of numbers, is a prime example of a domain which was originally inspired by 18th century physics. It was developed independently by mathematicians, and has recently returned to the fore in modern particle physics. The basic example of a period is the number pi (π), which has been known since antiquity. Other examples come from the classical problem of calculating the duration of the swing, or period, of a pendulum. The study of these quantities

inspired whole parts of mathematics, notably in the fields of number theory and algebraic geometry. “The numbers that came up in these old physics questions, which would now be called pure periods, were completely new at the time. Nowadays mathematicians understand these pure periods very well,” explains Dr Brown. Despite the fact that pure periods have been studied in mathematics for over a hundred years, it has only recently emerged that they form a tiny part of a much larger class of numbers called


mixed periods. Dr Brown’s work focuses on the study of mixed periods, which is an active area of research still in its infancy. “Towards the end of the last century mathematicians realised that pure periods were only the tip of the iceberg. Mixed periods form a much deeper and richer class of numbers that we are only just beginning to understand. They connect in surprising ways with many different branches of mathematics. The modern theory of periods, in the general sense, goes far beyond the questions they were originally designed to answer,” he outlines. Much of the project’s research revolves around mixed periods, and aims to gain a deeper understanding of their

between elementary particles and the fundamental forces of nature. The standard model is an example of a quantum field theory, and provides the most complete description of subatomic physics known to date. The theoretical predictions borne out by this theory are then tested in experiments at particle accelerators such as the Large Hadron Collider. In many cases theory and experiment agree to extraordinarily high levels of precision. Quantum electrodynamics, the quantum field theory which describes the interaction of light with matter, is one of the most precise physical theories ever developed, but this does not mean it is easy to use. In fact, the calculations are often so complicated it can be extremely

FIGURE 3. A Feynman diagram which Brown and Schnetz proved to be ‘modular’. This has disproved a number of open conjectures in the field.

mathematical structure and the reasons for their appearance in so many different domains. “In some sense, not all numbers are created equal in mathematics. Periods are one of the most important classes of numbers – those which seem to occur most frequently in lots of different problems. In fact, one of the reasons for the recent renaissance in periods is because of their ubiquity in particle physics”.

Quantum Field Theory Quantum field theory is the branch of physics which describes the interactions

70

Feynman diagram, and each diagram has a certain probability, called its amplitude. The amplitude, which is a number, is given by a mathematical recipe called a Feynman integral, but calculating it can be fiendishly difficult. “There are cases where physicists have spent years of their lives working on a single Feynman integral, and at the end you still have to add up vast numbers of these amplitudes to get the answer you want. It’s a phenomenally laborious process,” explains Dr Brown. With entire teams of physicists devoting years to this task, Dr Brown is keen to find more efficient means of computing Feynman amplitudes. The answer, according to Dr Brown, lies in the theory of periods. “We know that

FIGURE 4. Another modular Feynman diagram, without self-crossings.

hard to work out what the theory predicts. “Even though quantum field theories are often very elegant and simple on paper, extracting a practical prediction from the equations is an extraordinarily complicated mathematical process,” explains Dr Brown. In quantum field theory, anything that can happen, will happen. What quantum field theory does is to predict how likely it is that a possible interaction between subatomic particles will happen. In practice, this is a Herculean task: each one of a multitude of different possibilities is represented by a drawing called a

Feynman amplitudes are always periods. So we can bring to bear all the modern techniques of number theory and algebraic geometry to this problem”. The project is using the mathematical understanding of periods to try and improve the evaluation of Feynman diagrams, rather than the laborious processes currently used. Dr Brown’s team is currently putting into practice a new approach he developed a few years ago. In theory this method goes much further than those which are currently being used. “Right now, physicists have a dozen or so methods for
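For orientation, two textbook examples of periods (standard illustrations, not results of the project) are

\[ \pi \;=\; \iint_{x^2+y^2 \le 1} dx\, dy \qquad \text{and} \qquad \log 2 \;=\; \int_1^2 \frac{dx}{x}. \]

Written in parametric form, a Feynman integral is likewise an integral of an algebraic function over a domain cut out by polynomial inequalities, which is one way of seeing why, as Dr Brown says, convergent Feynman amplitudes fall into this class of numbers.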


At a glance

evaluating Feynman amplitudes, which have advantages and disadvantages in different situations. What we’re proposing is a single method that doesn’t need to be adapted every time; you just plug in your problem and it spits out the answer. On paper, at least, it goes far further than any of the current methods,” he outlines. The question for now is just how fast and efficient this method is in practice, which won’t be known until the project is completed.

Multiple zeta values

Besides thinking about practical applications for physicists, Dr Brown has recently made a major breakthrough in pure mathematics, which earned him the Elie Cartan Prize of the French Academy of Sciences. In 2011, he announced a proof of the Deligne-Ihara conjecture, which was the central problem in the field of multiple zeta values, and had remained intractable for about 25 years. “Multiple zeta values are a very special kind of period: they are the first, the nicest possible family of mixed periods,” explains Dr Brown.

These numbers have a venerable history: first discovered by Leonhard Euler in the eighteenth century, they were almost forgotten for a few hundred years until it was realised that they play a central role in the theory of periods. “In the 1980s, all of a sudden, multiple zeta values started to reappear in all sorts of different branches of mathematics: knot theory, number theory, and, most importantly, in physics; in quantum field theory”. In fact, it was expected for many years that a whole class of Feynman amplitudes should be multiple zeta values, and all the evidence pointed in this direction. “The experts believed that the only periods that ever came up in quantum field theory were in this very special class; in other words, nature only knew about multiple zeta values”. After working on this problem for several years, Dr Brown and his collaborators showed that, while this seems to be true in many examples, this prediction is actually false. “At some point, the picture begins to change radically, and physics goes totally outside the box that everyone hoped it was in. What this means is that quantum field theories are mathematically far more complex than anyone dared to imagine. It completely changes the picture. The question is now: if the numbers that particle physics is producing aren’t multiple zeta values, then what are they?” This is one of the many problems that the PAGAP project seeks to address.

In the future, Dr Brown plans to continue his work at the boundary between maths and physics. “I’m very interested in incorporating these ideas from physics, quantum field theories, into pure mathematics and using them as inspiration,” he says. The fundamental nature of this kind of work makes it very difficult to predict how research will develop, but Dr Brown is keen to encourage further collaboration between physicists and mathematicians. “There are very few people who can speak the language of both pure maths and physics, and cross-fertilise ideas. Particle physicists are studying some fantastically beautiful objects, which mathematicians could definitely help with, and mathematicians would benefit enormously from learning about these new structures and thinking about them,” he says. “So one of our goals is to train up people who are fluent in both these areas and who can build relationships between maths and physics.”
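For reference, the multiple zeta values mentioned here have a simple closed-form definition, standard in the literature rather than specific to PAGAP:

\[ \zeta(s_1, \dots, s_k) \;=\; \sum_{n_1 > n_2 > \cdots > n_k \ge 1} \frac{1}{n_1^{s_1} n_2^{s_2} \cdots n_k^{s_k}}, \qquad s_i \ge 1,\; s_1 \ge 2, \]

which for a single index reduces to Euler’s values, for example \( \zeta(2) = \sum_{n \ge 1} 1/n^2 = \pi^2/6 \).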

Project Objectives To undertake a systematic study of periods in pure mathematics, especially in the context of moduli spaces, and to relate these fundamental mathematical objects to quantum field theories in high energy physics, bringing to bear modern techniques from algebraic geometry to this emerging interdisciplinary area. Project Partners • Xxxxxxxxxxxxx • Xxxxxxxxxxxxx • Xxxxxxxxxxxxx Contact Details Project Coordinator, Dr Francis Brown Institut de Mathématiques de Jussieu 175, Rue du Chevaleret 75013 Paris France T: 00 +33 1 45 38 57 98 E: brown@math.jussieu.fr W: www.math.jussieu.fr/~brown/

Dr Francis Brown

Project Coordinator

Francis Brown received his bachelor’s degree from Cambridge University, before going on to study at the Ecole Normale Supérieure in Paris and completing his PhD under the supervision of Pierre Cartier in 2006. He is interested in periods of motives in algebraic geometry, and their relation to renormalizable Quantum Field Theories.

© CNRS Photothèque / Frédérique PLAS

Particle physicists are studying some fantastically beautiful objects, which mathematicians could definitely help with, and mathematicians would benefit enormously from learning about these new structures and thinking about them



Smarter RF microsystems for satellite links

The FLEXWIN project develops smart radio frequency microsystems, using a highly innovative technology platform to realize re-usable, re-configurable and multi-functional circuits. This will enable new RF-system architectures and reduce time-to-market, as project coordinator Volker Ziegler explains

An EU-backed initiative bringing together academic and commercial partners from across Europe, the FLEXWIN project is developing smart radio frequency (RF) microsystems, which will have greater capabilities and flexibilities than more conventional RF-systems. The project is combining several different ideas to establish this highly innovative technology platform, work which project coordinator Volker Ziegler says is targeted at clear commercial needs. “The goal of FLEXWIN is to develop antennas for mobile satellite communication links, for use in data transmission from an airplane to the satellite, where there is a clear need for low-cost, planar antennas which produce low drag on the aircraft. The second big aim is to develop reconfigurable receivers for base stations,” he outlines. Currently communication signals are transported up to satellites in one of two ways, depending on the intended data transmission rate. “One is using frequencies in the L-band, which gives a rather low data rate,” explains Ziegler. “The other one, which is also used for television, is in the Ku-band. But, most of the commercially available antennas up to now are based on mechanical steerable systems, so they are not really planar and aren’t ideally suited for use on planes.”

Planar antenna

Researchers believe that low-cost and active planar antennas would bring significant improvements, as they are electronically steerable, while the project is also working to move transmission from the Ku-band to the Ka-band. This would operate at higher frequencies, allowing the transmission of more data with smaller antennas. “We will have only one antenna aperture for transmit and receive, so you have more opportunities to integrate it on the plane,” points out Ziegler. The project

72

is developing both the antennas and the individual key components on which the technology depends. “We develop the individual chips in the antenna – combining the RF Microsystems with semiconductor technology to realise the multifunctional microwave circuitry. But, we are also looking at the whole antenna, how it can be built on a scalable basis with the radiating elements, integration and assembling technologies including cooling structures. Essentially we are developing the single functional building blocks, and are also building an antenna array demonstrator, which will be a part of the final antenna system. Our partial array will consist of 25 chips addressing up to six antenna elements each,” explains Ziegler. “The new aspect of our work is that the RF-microsystem is monolithically integrated with the semiconductor technology based on silicon germanium and BiCMOS. This technology enables a lot of functionalities on one chip.” These include BiCMOS for digital or mixed signal circuitry, silicon germanium transistors for the active analogue circuitry and RF-MEMS (micro-electromechanical system) for the switching and reconfiguration parts. This multifunctionality helps reduce the overall complexity of the antenna system. “We have a lot of multi-channel transmit-receive chips which realise the functionality of transmitting and receiving the signal; however, if you do not have the digital circuitry on this chip, you have to route maybe 20-30 control lines to each. And if the chips are densely spaced in the antenna array then you cannot realise all these control lines,” explains Ziegler. With digital and mixed-signal circuitry on the chip, Ziegler says it is possible to achieve the highest level of functionality. “You can do a kind of daisy-chain type control of the chips. So each chip has a digital address, and you only have to route these digital bus lines – maybe six of them – in a daisy-chain like manner through to all of the chips. Then you do the digital-analogue conversion on the chip. This is why this multi-functionality is a big enabler for the planar antenna system, which cannot be made in any other way,” he continues. “The analogue elements are likewise crucial to the satcom antenna. You cannot develop the antenna without them because you need all the amplifications, switching, as well as the phase and amplitude settings.”

The goal of FLEXWIN is to develop a smart RF-microsystem technology, which enables antennas for satellite communication links, for use in transport from an airplane to the satellite, where there is a clear need for low-cost, planar antennas which produce low drag on the aircraft. The second big aim is to develop reconfigurable receivers for base stations
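The control scheme Ziegler describes can be pictured with a short sketch: instead of routing dozens of analogue control lines to every chip, a handful of shared bus lines carry addressed digital frames, and each chip latches only the settings addressed to it, performing the digital-to-analogue conversion locally. The frame format, register names and values below are invented for illustration and are not the actual FLEXWIN interface.

```python
# Illustrative only: addressed control frames on a shared digital bus,
# mimicking the daisy-chain scheme described above. The frame layout and
# field names are hypothetical, not the real FLEXWIN chip interface.

from dataclasses import dataclass

@dataclass
class ControlFrame:
    chip_address: int   # which chip in the array should react
    phase_code: int     # beam-steering phase setting (applied on-chip via a DAC)
    gain_code: int      # amplitude setting

class AntennaChip:
    def __init__(self, address: int):
        self.address = address
        self.phase_code = 0
        self.gain_code = 0

    def on_frame(self, frame: ControlFrame):
        # Every chip sees every frame on the shared bus, but only the
        # addressed chip latches the new settings.
        if frame.chip_address == self.address:
            self.phase_code = frame.phase_code
            self.gain_code = frame.gain_code

# 25 chips share the same few bus lines instead of 20-30 dedicated lines each.
chips = [AntennaChip(address=i) for i in range(25)]

def broadcast(frame: ControlFrame):
    for chip in chips:
        chip.on_frame(frame)

broadcast(ControlFrame(chip_address=7, phase_code=13, gain_code=3))
print(chips[7].phase_code, chips[0].phase_code)   # -> 13 0
```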


At a glance

Full Project Title
Flexible Microsystem Technology for Micro- and Millimetre-Wave Antenna Arrays with Intelligent Pixels (Flexwin)

Antenna Frontend architecture

Reconfigurable chips

The project is developing micro- and millimetre-wave switches based on an advanced microsystem technology monolithically integrated with SiGe BiCMOS, so that the chips are reconfigurable and can be used for multiple applications. “For the communication antennas that we are developing we have a transmission frequency of 30 GHz and a receiving frequency of 20 GHz, which should all be integrated on one chip and one antenna. This is why we use the MEMS switch here to route the signal with the lowest possible losses between the transmitting and the receiving path of the antenna,” says Ziegler. “Furthermore, we are addressing base stations for terrestrial mobile communications. Highly reconfigurable receiver circuits will enhance their flexibility to adapt to changing frequency allocations and communication formats.” Additionally, the project is working on establishing the concept of reusable chipsets for millimetre-wave frequencies, which will be produced once and can be reconfigured afterwards to fit to a specific application. This approach is also supported by the technology platform currently under development in the project.

Commercial potential

From here Ziegler is keen to establish a follow-up project to FLEXWIN to produce a full-scale demonstrator. This would include around 2,000 elements, so scalability of the technology is a key issue, particularly in terms of any potential

commercial applications. “We are trying to develop the demonstrator in a way that’s easily scalable. All the things that we use in FLEXWIN; the integration technologies, the cooling structure and so on, are done in a way so that they are scalable to a larger size. This is very important in terms of commercialisation – there’s strong interest from industry in the project, so we want to ensure we can exploit it later on,” outlines Ziegler. The current exploitation plans centre on specific areas; among the commercial partners, Ericsson are focused primarily on base stations, while EADS are looking more towards aeronautics. “On the one hand EADS is an avionics systems supplier for planes or satellites, while on the other we also build planes,” continues Ziegler. “So we have a lot of applications for these planar antennas on aeroplanes. Satcom is the major one, but we also have different radar applications.” The two main applications of this technology are in mobile communication antennas and base stations, but Ziegler is keen to stress that the project’s research also holds wider potential, such as in the automotive and communication industry. For a long time the automotive industry has used two frequencies – 24 GHz for short-range detection and 77 GHz for long-range detection; Ziegler believes the FLEXWIN project’s research could lead to significant improvements in this area. “You could make a radar chip capable of switching between these two bands or covering the new 77-81GHz frequency-band,” he suggests. “Additionally, chips could be reconfigured to operate in the two E-Band frequencies (71-76GHz and 81-86GHz) for ultra-high data rate communication.”

Project Objectives
• Antenna structures with built-in intelligence
• IC design for reusability
• Enhanced functionality of the underlying semiconductor technology

Project Funding
EC contribution: €3,100,000.00

Project Partners
• EADS DEUTSCHLAND GMBH, IHP GMBH, UNIVERSITAET ULM, Germany
• UNIVERSITA DELLA CALABRIA, MIPOT SPA, Italy
• UNIVERSITY OF SURREY, United Kingdom
• ERICSSON AB, Sweden

Contact Details
Project Coordinator, Dr Volker Ziegler
EADS Innovation Works
81663 Munich
Germany
T: +49 89 607 20294
E: volker.ziegler@eads.net
W: www.flexwin.eu

Dr Volker Ziegler

Project Coordinator

Volker Ziegler has worked at the EADS Innovation Works since January 2003. He is an EADS Expert in Microwave Technologies and Systems and is responsible for the acquisition and management of national and international research projects in key microwave technologies for advanced radar and communication systems.


Understanding the importance of information

It is rare for an economic agent to have access to all the information relevant to a particular decision, such as the identity of a rival bidder and the amount they’re willing to pay for a given asset. Christian Hellwig of the InfoMacro project spoke to us about his research into how lack of information about aggregate economic conditions influences economic outcomes

The premise that different economic agents have access to different sources of information, hold different views about economic conditions and take these differences into account in their economic decisions seems simple enough, yet it is difficult to integrate this with a fully developed model of fluctuations in the aggregate economy. This area forms the primary research focus of the InfoMacro project. “The main objective of InfoMacro is to gain a better understanding of how lack of information about aggregate economic conditions influences economic outcomes,” says Professor Christian Hellwig, the project’s scientific coordinator. Economic agents, whether they are firms, households or investors, tend to have only a partial view of all the things that are going on around them; Professor Hellwig points to the housing market as an illustration. “Say I’m looking at the house I might want to buy. There are other buyers in the market – I don’t know who they are, what they might be willing to pay, how they arrive at their decisions, and so on,” he outlines. “Now, in a micro context, let’s take a single house. We have a rich literature dealing with these issues; we have a theory of auctions that analyses the bidding behaviour that arises, how people make their decisions when they try to figure out what the house is worth, and the price that is likely to emerge in the end.”

Housing market

Putting this information together into a market for all the houses in a particular area is far more complex, however. The bidding behaviour on one house is not going to be independent of another, and these markets are also more broadly interconnected. “If you go one step further,

what’s happening to these houses is not independent of the labour market, of the jobs that the individuals bidding for houses are competing for. We can also consider wider dynamics – what I’m willing to bid on a house today is going to be dependent on what I expect others to bid on a house in the future. So I’ve just drawn a series of linkages across markets, across locations, across time,” explains Professor Hellwig. Researchers now have a good understanding of how to analyse these linkages when abstracting from these informational issues. “The basic insight of the rational expectations revolution of the ’70s was that to really understand macroeconomic fluctuations, we needed to start by thinking about decisions at the level of individuals, at the micro level,” says Professor Hellwig. “The first models of this were very simple, abstracting from heterogeneity across individuals, informational issues and frictions in the markets to build basic models of things like consumption and savings behaviour, price adjustment and investment behaviour.” Over the last thirty years economists have moved away from the assumptions inherent in these models, such as the principle that all economic agents share the same information, and started building more complex models. These models now include information on behaviour and data gathered at the micro level to allow for heterogeneity and frictions in the adjustment process. “We’ve started to get a better understanding of how these linkages work. At some level I’m taking this process a step further by developing methods with which we can relax some of the informational assumptions that were clearly very strong, and perhaps

unrealistic, in the basic models,” continues Professor Hellwig. Some assumptions are necessary as a basis for the development of economic models, but it is important to question them as part of ongoing refinement. For example, most asset market models used for macro purposes do not include informational frictions and have no-arbitrage assumptions about the functioning of the market embedded in them. “These models closely tie asset prices to fundamental values, almost as an operating principle,” says Professor Hellwig. “If that’s your fundamental assumption then fluctuations in the asset market are always interpreted as reflecting wider economic conditions.” This assumption is very contentious however, and the financial crisis of the last five years has triggered further debate on its validity. Professor Hellwig is looking at the connection between the wider economy and the asset market when this assumption is not validated. “One of the things I’m currently working on is a model of asset markets in which incomplete information, and the information that emerges endogenously through prices, offers us some departures from this efficient markets benchmark,” he outlines. A number of factors can drive changes in asset prices; the objective in developing this model is to try and understand whether the information to which investors have access can play a role in shaping fluctuations. “A model where everybody has the same information is a lot easier to analyse than a model where everybody has different information,” continues Professor Hellwig. “The first step has to be to understanding, at a general level, how the choices individuals make are going to interact. Here you’re



At a glance

Full Project Title
Information heterogeneity and frictions in macroeconomics and finance (InfoMacro)

Project Objectives
This proposal develops theoretical and quantitative methods for the analysis of dynamic equilibrium models, in which different participants hold different views of the aggregate conditions, and are aware of the differences in beliefs. The project applies these methods to explore the impact of information on asset price fluctuations, business cycle dynamics, and the optimal design of information provision.

Contact Details
Project Coordinator, Christian Hellwig
Toulouse School of Economics
Manufacture des Tabacs
21 Allees de Brienne
31000 Toulouse
T: +33 (0)5 61 12 85 93
E: christian.hellwig@tse-fr.eu
W: http://www.tse-fr.eu/index.php?option=com_content&task=view&id=839&Itemid=1

going to face a two-way connection. One is, individuals will try to forecast the market outcomes, but then they make decisions that influence these market outcomes and they can, over time, learn from what is occurring in the market.”
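The two-way connection Professor Hellwig describes, in which agents forecast the market, act on those forecasts, and then learn from the prices their own actions help to set, can be illustrated with a toy simulation. The numbers and the simple averaging rules below are arbitrary illustrations, not the InfoMacro project's model.

```python
# Illustrative only: a toy market where agents with different private signals
# update their beliefs using the price that their own trades help to set.
# Parameters and updating rules are arbitrary, not InfoMacro's model.
import random

random.seed(0)
TRUE_VALUE = 100.0
N_AGENTS = 50

# Each agent starts from a noisy private signal about the asset's value.
beliefs = [TRUE_VALUE + random.gauss(0, 10) for _ in range(N_AGENTS)]

for period in range(10):
    # Agents act on their beliefs; the market price aggregates those actions
    # (here, crudely, as the average belief plus a little trading noise).
    price = sum(beliefs) / N_AGENTS + random.gauss(0, 1)

    # The price is public information, so every agent partially updates
    # towards it -- the "two-way connection" between beliefs and outcomes.
    beliefs = [0.7 * b + 0.3 * price for b in beliefs]

    spread = max(beliefs) - min(beliefs)
    print(f"period {period}: price {price:6.2f}, belief spread {spread:5.2f}")
```

Running the loop shows the spread of beliefs shrinking as everyone learns from the same public price, which is exactly the kind of feedback that makes these models hard to analyse when information differs across agents.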

Research questions

This feedback helps form expectations about the market and how it will evolve, while at the same time an economic agent’s decision-making is based on those expectations and the available information. Professor Hellwig says it becomes easier to predict how other individuals are going to act when everybody has access to the same information, which in turn has knock-on effects. “At some level the fact that it becomes a lot easier to predict the actions

These models are highly relevant for understanding asset market fluctuations, but the project’s primary focus is more on incomplete information problems, and bridging the gap between theoretical insights and quantitative evaluation. The first step is developing a theoretical framework to analyse such problems. “We’ve talked about asset markets, but the same questions arise with household decisions about consumption and savings, and investment and pricing decisions by firms – any situation where economic decisions are based on fairly limited information relative to all the things that are going on at the same time,” points out Professor Hellwig. The second task is to apply the framework to understand how information frictions

The basic insight of the rational expectations revolution of the ’70s was that to really understand macroeconomic fluctuations, we needed to start by thinking about the decisions at the level of individuals, at the micro level

Christian Hellwig

Project Coordinator

Christian Hellwig is Professor of Economics at Toulouse School of Economics. He worked at UCLA from 2002 to 2010, being promoted to Associate Professor with tenure in 2007. He has co-edited the Journal of Economic Theory since 2008 and been an editorial board member for the Review of Economic Studies since 2009.

of individuals in the market and how they value the assets makes it much easier for the market to allocate resources or value assets effectively,” he says. Professor Hellwig is also working on an asset market model based on the behaviour of individual investors. “Everybody in the model is assumed to act exactly rationally. So, we assume individuals in our model are patient, they know the statistical properties of the information they’re receiving, how to compute probabilities, and how to update their expectations. And yet the market outcome looks ‘irrational’,” he outlines. “That is, if you look at the market outcome, and you think of the market as an individual, you would have to conclude that that individual doesn’t know how to compute probability. Or that individual looks like they’re too confident in the quality of information that’s generated.”

affect things like asset markets, pricing and consumption behaviour. “Here we are moving from the level of individual decisions towards aggregates, and we are calibrating the models to match properties of the micro data on individual decision-making,” says Professor Hellwig. “I think the third step is testing how these models perform quantitatively in the aggregate, and also testing whether incomplete information has major effects on these fluctuations, where information plays an important role for market outcomes and when its role is secondary, which is the goal we’re trying to reach. Finally, there are important policy questions – for example, how private companies as well as policy institutions like the central banks can influence outcomes by providing information to others.”


New resources for Europe’s policy-makers At a time of widespread cynicism about the political process, technology offers a way to enhance transparency and give citizens a chance to contribute to the debate. David Osimo of the Crossover project explains how technology and social media can be harnessed to build a new knowledge base that takes full account of human behaviour

Governments across the world are keen to harness the power of technology as a means of enhancing transparency, gathering data and understanding the impact of specific policies. Today, technology is increasingly being viewed as a key component in the policy-making process, giving governments important data which they can use in policy development. At a time of widespread cynicism about politics, technology offers a way for Governments to communicate with citizens and engage with a wider range of people on the major issues they face. How can this help Government find a ‘third way’ between a restricted policy debate dominated by political elites and the open discussions found on social media sites? The CROSSOVER project, an initiative co-funded by the European Commission, is at the vanguard of research in this area. Bringing together and reinforcing links between researchers and experts, the project aims to establish a knowledge base of real life applications and frontier research, offering an invaluable resource to policy-makers. As the Director of Tech 4i2 Ltd, an advisor on online engagement for the Digital Agenda for Europe and a key figure in CROSSOVER, David Osimo is well qualified to contribute to the debate. We spoke to David about the work of the CROSSOVER project, its ideas and its research, and its likely impact on the way we are governed.

Consultative approach

EU Researcher: Does the goal of encouraging innovation and technological development demand a consultative approach to policy-making?

David Osimo: Traditionally you might expect to have an expert decide what particular projects should be funded and what technologies should be researched and so on. However, we now realise that technological developments are not predictable in a linear way, going from laboratories to market, and that instead they need to be continually improved by bringing the maximum possible number of intelligent people around the table. Real innovation comes from the edges. The capacity to innovate is correlated to the capacity you have to reach out – if you are able to reach out to new people you are more able to innovate. This is true for research, as it is for policy-making.

EUR: Is it about engaging with a wider range of people? Is this likely to have a political impact?

DO: Absolutely. Our goal is not only to promote innovation, it’s also to promote better policies. So it’s not really about encouraging better policy-making in order to promote innovation, it’s rather the other way round; using innovative tools to develop better

Technological projects are not predictable in a linear way, they need to be continually adjusted to bring the maximum possible number of intelligent people around the table. Real innovation comes from outside the boundaries of your network


policies. And we’re covering applications and tools which are basically global in nature. EUR: Can ICT issues be dealt with by traditional power structures and modes of government? Or do they demand a new approach? DO: The technological reality today is so complex and fastevolving that it’s very important for governments to consult widely in policy development. It’s no longer the case that someone in a room can decide what is good for a country, or for government to think that it can have all the good ideas. In order to ensure good governance, governments need to be accountable, but also to be able to reach out to external intelligence. Just in the same way as successful companies do. This means not just asking one designated expert, but rather asking questions and reaching out to people you don’t know, from outside the established networks, as well as the people you do know.

Social media

EUR: What role does social media play in this?

DO: Social media offers great opportunities, but the goal is to attract relevant people through social media, not to just attract large numbers of people. Rather than gathering a representative sampling of 1,000 respondents, you want to somehow let the good ideas emerge from the discussion in organic fashion. It’s about on the one hand being open to anyone, but it’s also very selective.

EUR: Are these technologies aiming to provide access to data in real time?

DO: One of the key ideas is that technology makes it possible to collect large amounts of data much more rapidly and make it available in real time. For example, you can use the sensor capacity of smartphones to collect data. Let me give you just one example; one of the applications we are studying, Carbon Diem, can be installed on your smartphone. Based on your movement, recognised by GPS, it understands

what transport you’re using and the carbon footprint of your behaviour. The city can use this data to plan transport and control carbon emissions. EUR: Are these applications designed to be quite unobtrusive? Do they require much effort on the part of the user? DO: These technologies are very refined in terms of design and usability. Rather than imposing change on humans, as traditional software used to do, they take full account of human psychology. For instance, one important trend is called ‘gamification’. This means that you can exploit fundamental human instincts such as the desire for friendship and fun to encourage positive behaviour. EUR: Do you see these tools as a way to address major societal challenges? DO: Absolutely. They are vital to many societal challenges. A key example is health; last year, 9 per cent of Americans had at least one health application installed on their smartphone. This includes applications that encourage you to exercise, applications where you record and share the quality of the food you eat and so on. Similar apps are available to encourage more sustainable consumption. EUR: Will this give policy-makers a sounder evidence base on which to base their decisions? DO: Yes. It’s not only about sounder evidence, but also better data. One of the key aspects of evidence-based policy-making is understanding causality. For instance, the reason that Europe doesn’t have more start-ups is because of the lack of venture capital. These technologies are used to aggregate data, to explore the links to detect causal relationships that we may not previously have been aware of. Then we can model this system of cause and effects, in order to simulate the impact of specific policies.

These technologies are used to aggregate data, to explore the links to detect causal relationships that we may not previously have been aware of. Then we can model this system of cause and effects, in order to predict the impact of specific policies

effects, in order to predict the impact of specific policies been aware of. Then we can model this system of cause and to detect causal relationships that we may not previously have 78 These technologies are used to aggregate data, to explore the links

EU Research


Health applications

EUR: Can these technologies have an impact on both the individual and collective level?

DO: Yes. Take the problem of alcohol consumption, which is a health policy problem. Traditionally, you’d say either raising taxes or prohibiting alcohol altogether would be the main tools to make people drink less. However, there is evidence that this fails because you don’t account for the fundamental driver of people’s behaviour, which is their friends. Technology enables you to make these social networks transparent, to understand people’s behaviour and to use these networks to encourage positive change. Behaviour is now a fundamental aspect of public policy. A major challenge like climate change cannot be solved by Government alone, for example; it is a shared responsibility that requires shared action.

EUR: Do you have any plans to extend the Crossover project?

DO: This project will end in March 2013, but there are other projects that are doing similar things, so development is ongoing. We plan to organise a conference in the U.S., while at the end of the project we are launching a prize for policy software, which we expect all the best software producers worldwide to compete for. This will include not only software, but also the real-life applications of this software. So you have to demonstrate not only that you have an innovative product, but also that you have implemented it.

www.crossover-project.eu

David Osimo



A new theory for new systems

A new theoretical framework is required to describe information encoded in a single atom or photon, as such systems don’t behave according to the laws of classical physics. We spoke to Professor Renato Renner, coordinator of the GEQIT project, about their work to develop a new, generalized theory of classical and quantum information

Over the last twenty years it has become possible to use quantum systems, such as single atoms or photons, to represent and process information. These systems do not behave according to the laws of classical physics. “An atom can have properties which are very counter-intuitive. To illustrate, if you look at an atom, it will change its state. Scientists quickly realised that such properties could be exploited, for example for security applications,” says Professor Renato Renner. As the coordinator of the GEQIT project, Professor Renner is working to establish a theoretical framework underlying such applications. “A whole new research field, called quantum information theory, has emerged. Its goal is to investigate the possibilities that the use of quantum systems offers for information processing,” he outlines. “But a central idea of the GEQIT project is to also go back in the opposite direction and see if we can learn new things about physics from the insights that we’ve gained from quantum information theory.”

Figure 1: Information is always represented by physical systems (see reference at end). This (seemingly innocent) observation has important consequences. In particular, any (realistic) theory of information needs to take into account the physical nature of the systems used for the storage and processing of information. The development of quantum information theory (QIT) as a generalization of Shannon’s classical theory (as well as quantum computation as a generalization of the classical theory of computing) over the past few decades was therefore a necessary consequence of the insight that our physical world is non-classical (green arrows pointing to the right). Over the past decades, an impressive collection of methods and techniques have been established in (both classical and quantum) information theory. It is thus natural to ask whether this knowledge can be transferred back to the area of physics. It turns out that this is problematic in many cases, as the established information-theoretic techniques are often based on unphysical assumptions. One of the aims of Renner’s research is to remedy this situation by developing a generalized QIT, which does not rely on such assumptions, thus making it usable in physics (red arrows pointing to the left). Landauer, Rolf (1999) Information is a physical entity, Physica A 263, 63-67.

Entropy

This represents a new approach to physics, using the laws of information to address unsolved problems. One particularly important concept in information theory as put forward by Shannon in 1948 is the notion of entropy. “Entropy is a measure of uncertainty, and therefore also quantifies information. The more information I have about something, the less uncertain I am about it,” explains Professor Renner. Shannon showed that entropy has an operational meaning. The entropy of a data source, for instance, tells us about the minimum size to which the data can be compressed. The concept of entropy also plays an important role in physics, but in a completely different context. “For example, if you want to analyse the performance of a steam engine – how efficiently it transforms heat into work – then you need to rely on the physicists’ version of entropy. It is defined via thermodynamic variables such as heat and temperature. At first sight these variables seem unrelated to uncertainty or data compression,” continues Professor Renner. “Nevertheless, the mathematical formulas for the physicists’ notion of entropy and the information theorists’ notion look almost identical. This is surprising. However, using recent insights in information theory, we are starting to understand why they are related and can now make connections between them.”

These connections can be exploited to transfer insights from physics to information theory and vice versa. Historically, information theory has already learned a lot from physics; in fact, when Shannon developed information theory, his aim was to develop a fully general mathematical theory that was independent of the physics of the information carriers. “However, Shannon’s theory doesn’t correctly describe information carried by quantum systems,” explains Professor Renner. “Quantum information theory extends Shannon’s theory to include quantum information carriers. But the theory also modifies Shannon’s theory in the sense that certain things that were assumed to be generally true turned out to be so only in special cases. For example, Shannon stated that information can always be copied as we like. This is however no longer true in quantum physics, as the act of reading out the information encoded in a quantum system unavoidably changes it, rendering it impossible to copy.”

The generalized theory that Professor Renner is developing goes one step further than existing information theory. While information theory is designed to study data processing and communication, Renner’s aim is to make it applicable in physics, too. “We want to develop a general theory, so that mathematicians, information theorists and physicists are all happy with it – we want this generalized theory to meet the needs of all these groups,” he says. This theoretical research is necessary to study the ever-smaller systems we use to process information, underlining the wider importance of the project’s work. “If miniaturisation continues to progress at the same speed as we’ve seen in recent years, the structures on a chip will soon be of the magnitude of a single atom. But even before we reach that limit, the laws of quantum theory will become dominant,” predicts Professor Renner.
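The resemblance Professor Renner describes is easy to display. For a source or system whose outcomes occur with probabilities p_i, the two standard textbook expressions are

\[ H \;=\; -\sum_i p_i \log_2 p_i \quad \text{(Shannon entropy, in bits)} \qquad \text{and} \qquad S \;=\; -k_B \sum_i p_i \ln p_i \quad \text{(Gibbs entropy)}, \]

so, once the logarithms are matched, the two differ only by the constant factor \( k_B \ln 2 \).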

Beyond quantum information theory
The basic feature of quantum information theory is that it assumes information carriers are correctly described by quantum mechanics. “However, we can take a step back and ask: what happens if quantum theory is not correct? Would we still arrive at the same theory of information? It turns out that if quantum theory was different, then the corresponding information theory derived from that different theory might look very strange. So strange actually that we can infer that quantum theory must really be as it is,” outlines Professor Renner. “In other words, if quantum theory was different we would arrive at such unnatural conclusions about the behaviour of information that we wouldn’t believe in such a theory.” Despite these philosophical ramifications, the theory developed within the GEQIT project also has very direct practical applications, such as in quantum cryptography. “We can use our theory to analyse information that has no internal structure,” outlines Professor Renner. This is particularly important in cryptography, where an adversary may tamper with the information, thereby destroying its structure.

Another related application is the generation of random numbers using quantum processes. Professor Renner and his colleagues are collaborating with a Geneva company which produces random number generators, a more complicated task than might be thought. “The problem is that if you really want the outcome to be random, then it’s not sufficient to just toss a coin. In fact, the outcome of a coin toss is not really random – it is governed by the laws of classical physics, which are completely deterministic,” he explains. “However, in quantum mechanics there are processes that have a truly random outcome, and these processes can be realised experimentally. In other words, there are quantum-mechanical experiments for which it is completely impossible – we can even prove this mathematically – to predict the outcome.”
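As a rough illustration of the idea (and not a description of the Geneva company’s device or of the project’s methods), the sketch below simulates the textbook quantum random number generator, measuring the state |+> in the computational basis, and adds a classical von Neumann post-processing step to remove bias. Because it runs on an ordinary computer, the “quantum” outcomes here are themselves only simulated with a pseudo-random generator, which is exactly the limitation Professor Renner describes.

```python
import random

def measure_plus_state(bias=0.5, rng=random.random):
    """Simulate measuring |+> in the computational basis.

    A real device draws its randomness from the quantum measurement itself; here the
    outcome is only *simulated* with a pseudo-random generator, and the bias parameter
    mimics an imperfect source whose raw bits are not perfectly balanced.
    """
    return 1 if rng() < bias else 0

def von_neumann_debias(raw_bits):
    """Classical post-processing: turn possibly biased (but independent) bits
    into unbiased ones by keeping only non-matching pairs."""
    out = []
    for a, b in zip(raw_bits[::2], raw_bits[1::2]):
        if a != b:
            out.append(a)   # pair '01' -> 0, pair '10' -> 1, both equally likely
    return out

raw = [measure_plus_state(bias=0.55) for _ in range(10_000)]
clean = von_neumann_debias(raw)
print(f"raw bits: {len(raw)}, mean {sum(raw)/len(raw):.3f}")
print(f"debiased bits: {len(clean)}, mean {sum(clean)/len(clean):.3f}")
```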


This ability to generate genuine randomness holds real importance for issues like internet security. When computers communicate with each other they both locally generate randomness to set up a secret key, so they need the ability to generate random numbers. “Computers today do this in a very insecure way – they just take a number from something that looks random, but may not be random,” says Professor Renner. However, these practical applications are not the main focus of Professor Renner’s work and he plans to pursue further basic research in future. “I’m interested in understanding the thermodynamics of small systems. I would like to ask questions such as ‘how small can we make an engine and still have it function effectively?’” he outlines. “To answer such questions, conventional theoretical tools do not apply – we can’t just take a large thermodynamic machine and make it smaller – there are quantum mechanical laws that start to be relevant for the description of that machine. But, making use of the recently discovered links between thermodynamics and information theory, we are optimistic that we will eventually find answers to these questions.”
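Professor Renner’s point that computers often derive keys from sources that merely look random can be sketched with standard library calls. The example below is generic and unrelated to the project or to any particular product; the seed value and key size are arbitrary stand-ins. A generator seeded with a guessable value is fully reproducible, whereas the operating system’s cryptographic source is designed for key material.

```python
import random
import secrets

# Insecure: a pseudo-random generator seeded with a guessable value (e.g. a timestamp)
# produces a "key" that anyone who knows the seed can reproduce exactly.
guessable_seed = 1670000000          # stand-in for int(time.time())
rng = random.Random(guessable_seed)
weak_key = rng.getrandbits(128).to_bytes(16, "big")

attacker_rng = random.Random(guessable_seed)
assert attacker_rng.getrandbits(128).to_bytes(16, "big") == weak_key   # fully predictable

# Better practice: use the operating system's cryptographic randomness source.
strong_key = secrets.token_bytes(16)
print(weak_key.hex())
print(strong_key.hex())
```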

At a glance

Full Project Title
Generalized (quantum) information theory (GEQIT)

Project Objectives
Quantum information theory is an increasingly active branch of science. Numerous important insights and powerful techniques have emerged from research in this area over the past few decades. A central aim of Renner’s research (and teaching) efforts is to make these usable for the study of both theoretical and applied problems in physics.

Project Funding
Funded by the European Research Council

Contact Details
Project Coordinator, Professor Renato Renner
Institut für Theoretische Physik
HIT K 41.2
Wolfgang-Pauli-Str. 27
8093 Zürich
Switzerland
T: +41 44 633 34 58
E: renner@phys.ethz.ch
W: http://www.phys.ethz.ch/

• Colbeck, Roger and Renner, Renato (2012) Free randomness can be amplified, Nature Physics 8, 450-454.
• del Rio, Lidia et al. (2011) The thermodynamic meaning of negative entropy, Nature 474, 61-63.
• Renner, Renato (2012) The fridge gate, Nature 482, 164-165.
• Berta, Mario et al. (2010) The uncertainty principle in the presence of quantum memory, Nature Physics 6, 659-662.

Professor Renato Renner

Project Coordinator

Renner studied physics, first at EPF Lausanne and later at ETH Zurich, where he graduated in theoretical physics. He then worked on a thesis in the area of quantum cryptography. After receiving his PhD, he spent two years in the UK as an HP research fellow in the Department of Applied Mathematics and Theoretical Physics at the University of Cambridge. His research interests are in the area of Quantum Information Science.



Some passengers find the aircraft cabin environment uncomfortable, with poor air conditioning being one of the most common complaints. By developing personalised climate systems the iSPACE project aims to give passengers individual control over humidity, temperature and airflow at seat level, as project coordinator Dr Gunnar Grün explains

Climate systems for more comfortable flying

For some passengers, flying can be an uncomfortable experience, with poor air conditioning being one of the most common complaints. As the coordinator of the iSPACE project, Dr Gunnar Grün aims to give passengers some level of control over their immediate environment. “The overall aim of the project is to bring personalised climate systems into the aircraft cabin,” he says. These personalised systems will be integrated into business and first class seats, giving passengers the ability to modify temperature and humidity to a level that suits them. “Airlines receive many complaints that it’s either too hot or too cold and draughty, and of course it’s also known that the aircraft environment is very often perceived as quite dry. So these environmental parameters are the main aspects we focused on,” outlines Dr Grün. “In looking at temperature, air flow and humidity, we aim to give the passenger the option to control them and to create their own comfortable microenvironment.”

Humid air
The usual medium by which temperature and humidity are modified is air, so the project is also looking at ventilation patterns, which are particularly relevant for the humidification case. Usually one system controls ventilation levels for the whole aircraft, but the project has integrated independent systems into the seat which give passengers control of their local environment. “We said: ‘ok, there are lots of complaints. More complaints than you would have in less densely occupied spaces than an aircraft cabin.’ If people get the opportunity to control the cabin environment themselves, then maybe we could get to a level where most of the passengers are satisfied,” says Dr Grün. Humid air can be brought into the local environment, but it needs to be quite close to the passenger to have the desired effect. “When you blow humid air with very high velocity you’re actually drying the passenger, even though you are bringing in humid air. So it’s quite a demanding problem compared to modifying temperature, where you usually use simply warm or cool air,” explains Dr Grün.

Humidity, temperature and ventilation are generally inter-related, but Dr Grün says the passenger’s perception of them isn’t. “You cannot change humidity without considering temperature, because the air can carry a certain amount of water, depending on how warm it is, so they are necessarily inter-related. But if you feel that it’s dry, then this is not directly related to sensory temperature perception,” he outlines. However, the project is not working in the range where the inter-relation between those factors and the perception of them is very critical. “If it’s very humid and very hot passengers will be uncomfortable. But we are not in the temperature range inside the aircraft cabin where that’s very important,” says Dr Grün. “So for the perception side it’s not too problematic, but it is in terms of providing warm and humid air.” A passenger may want to bring humid air towards their eyes, for example, as described earlier, but blowing it with high velocity counteracts the intended impact.



However, passengers can modify the humidity level without having an effect on temperature. “When you humidify air it first cools down a bit, due to thermodynamics. But in our system the passenger can then heat it up again, so that we don’t have air with different temperatures when a passenger wants more humid air,” explains Dr Grün. Of course some passengers may be perfectly happy with the ventilation levels and not be keen on being subjected to a quick blast of humid air from the person in the next seat; Dr Grün says the choice of one passenger will not have an overt impact on their neighbour. “For example, if you look at these features integrated into the seat, like seat heating or heating mats, with ventilation inside the seat – this doesn’t affect the neighbouring passenger. If you have a small nozzle which blows air towards yourself, it sometimes has an impact but most of the time it doesn’t,” he stresses. These principles have been rigorously tested in the project, with a very small central processing unit used to control humidity and temperature. This is particularly relevant for long-haul flights, where Dr Grün says comfort is of primary importance. “Your perception of dry air, for example, usually becomes active after 3-4 hours. That’s when symptoms and complaints arise,” he explains. Issues like external temperature and altitude, while of course important to the aircraft itself, aren’t usually major considerations in terms of the cabin environment, but can affect passengers seated near the windows. “The outside temperature is -50°C or so during flight – one place where passengers will recognise this is in the window seats, where the lining is cooler,” explains Dr Grün. “This is an aspect we looked at – when you have a seat which is in the inner part of the plane it’s different to the outer part. The person by the window – or next to the emergency exit – can’t control their temperature at the moment. That’s one of the reasons why preferences inside the cabin differ and why we looked into this.”
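The inter-relation between temperature and humidity that Dr Grün describes can be made concrete with standard psychrometric approximations. The sketch below is a generic illustration rather than the iSPACE control model; the Magnus formula and the latent-heat estimate are textbook approximations, and the numbers chosen are arbitrary examples.

```python
import math

def saturation_vapour_pressure_hpa(temp_c):
    """Magnus approximation for saturation vapour pressure over water, in hPa."""
    return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

def max_water_content_g_per_m3(temp_c):
    """Approximate maximum absolute humidity via the ideal gas law for water vapour."""
    e_s = saturation_vapour_pressure_hpa(temp_c) * 100.0   # Pa
    R_v = 461.5                                            # J/(kg K), gas constant of water vapour
    return e_s / (R_v * (temp_c + 273.15)) * 1000.0        # g of water per m^3 of air

# Warmer air can hold considerably more water, which is why humidity and
# temperature cannot be controlled independently of one another.
for t in (10, 20, 24, 30):
    print(f"{t:2d} C: air can hold at most about {max_water_content_g_per_m3(t):5.1f} g/m^3")

# Evaporative cooling: humidifying air by dw (kg water per kg dry air) cools it by
# roughly dT = L_v * dw / c_p, which is why the seat system gently reheats the air.
L_v = 2.45e6    # latent heat of vaporisation, J/kg
c_p = 1005.0    # specific heat of air, J/(kg K)
dw = 0.002      # adding 2 g of water per kg of dry air
print(f"Humidifying by {dw*1000:.0f} g/kg cools the air by about {L_v * dw / c_p:.1f} K")
```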

Premium passengers
A passenger flying on a short-haul, no-frills flight might be willing to tolerate this, but somebody who has paid for a premium ticket is less likely to grin and bear it. With many airlines offering the same routes, comfort offers a key way for airlines to differentiate themselves from their competitors and attract premium passengers, who are a crucial segment of the market. “Premium passengers are an important group for the airlines. They want to be able to say they are offering the most comfortable seat,” says Dr Grün. A stakeholders’ club was established during the project, to which airlines were regularly invited to test the systems and provide feedback. “The airlines told us they could imagine using such systems to attract other customers,” continues Dr Grün. “It’s very important that passengers quickly recognise that something is happening when they use the system. When you don’t perceive an effect immediately, then you play with the system more and more and you’re not satisfied at all.”



The ultimate aim of this work is to give passengers the ability to control their local environment so that they can enjoy a more comfortable flight. These systems have been tested in the Fraunhofer Institute’s flight test facility using real subjects, and the project is looking to improve them further. “We built these prototypes into our aircraft cabin and simulated flights under usual cabin pressure levels. We regulated the global cabin control to predefined levels, and then the passengers in these tests could fiddle with all the different systems and try to find their optimum solution,” says Dr Grün. The goal of improving the systems further forms an important part of Dr Grün’s future research agenda. “We will certainly look more into such systems, to make them easier to use and to understand the demands of passengers, especially when it comes to local asymmetries in the climate, like cold window seats. This is not just a topic for aviation either, it’s also a question for the automotive sector; how does the body react to such local asymmetries?” he asks.

At a glance

Full Project Title
Innovative Systems for Personalised Aircraft Cabin Environment (iSPACE)

Project Objectives
The primary objective of iSPACE is to provide a step-change in passenger comfort during flight by providing aircraft manufacturers and the supplier industry with the knowledge and innovations needed to address the individualisation of the passenger cabin environment. Therefore, new concepts and technologies are developed, as well as suitable simulation tools.

Project Funding
FP7 Collaborative Project. EU contribution: €2,675,962

Project Partners
Airbus Operations GmbH, DE • Brno University of Technology, CZ • Contour Aerospace Ltd, UK • EADS Deutschland GmbH, DE • Icon Technology & Process Consulting Ltd, UK • Medical University of Vienna, AT • Pall Manufacturing UK Ltd, UK • W.E.T. Special Products GmbH, DE • Streit-TGA GmbH & Co. KG, DE

Contact Details
Project Coordinator, Dr.-Ing. Gunnar Grün
Head of Department Indoor Climate
Fraunhofer Institute for Building Physics IBP
Fraunhoferstraße 10
83626 Valley, Germany
T: +49 (0)8024 643 228
F: +49 (0)8024 643 366
E: gunnar.gruen@ibp.fraunhofer.de
W: www.ibp.fraunhofer.de

Tsikouris, K.; T. Fišer, J.; Noeske, I.; Trimmel, M.: Detailed Simulation Study and Subject Testing of Individualised Aircraft Cabin Suites Environment. In: Proceedings of 5th International Conference from Scientific Computing to Computational Engineering (5th IC-SCCE), Athens, Greece, 4-7 July, 2012.

Dr.-Ing. Gunnar Grün

Project Coordinator

Dr. Gunnar Grün has been head of the indoor climate department at the Fraunhofer Institute for Building Physics since 2011. In this role he deals with thermal comfort and thermoregulation, indoor climate control and individual climatisation, and computational fluid dynamics and zonal models, among many other areas of research.




For more information, please visit: www.euresearcher.com




