The University of Southampton

Date:
2010-2013
Themes:
Modelling and Simulation, Applied Electromagnetism
Funding:
EPSRC

Plasma electrolytic oxidation (PEO) is a novel surface engineering technology, allowing relatively thick oxide coatings to be formed on metal parts. The process is superficially similar to the more familiar hard anodizing. The substrate is immersed in an aqueous electrolyte and a high potential, usually AC, is applied to it. The voltage between substrate and electrolyte rises rapidly as the native oxide thickens, and within a few minutes reaches several hundred volts. Sparks then start to appear on the surface, and this sparking persists throughout processing. Recent work at Cambridge has confirmed that these are optically active plasmas, with durations ranging from tens to hundreds of microseconds, and peak temperatures up to ~10,000 K. These discharges allow oxide growth to proceed, producing films with thicknesses of up to 100 microns or more. Moreover, the discharge events have a profound effect on coating microstructure, and hence on its physical and mechanical properties. It is thus possible to produce thick, strong coatings, extending the utility of such oxide films to encompass protection in tribologically and chemically aggressive environments, and also to offer significant thermal barrier function. The process is particularly effective on Al, on which it can generate thick, highly adherent, hard and wear-resistant coatings.

Among the attractions of the process is that it involves very few health or safety hazards: the electrolytes required contain neither the concentrated sulphuric acid nor the chromate ions necessary for hard anodizing. PEO coatings can be grown using solutions as simple and dilute as 0.02 M KOH, and these electrolytes are generally so mild that they do not require special chemical disposal, being less harmful than many household cleaning products. This is an advantage of increasing significance. Furthermore, coatings of uniform thickness can quickly and easily be produced on components with complex surface geometry, over a wide range of sizes, with no requirement for chambers or special environments. This cannot be said of most other coating techniques, such as thermal spraying, ion beam plating and sputtering.

In view of these advantages, PEO has recently attracted intense commercial and academic interest. Much of the research carried out hitherto has been aimed at characterisation of the coatings, and at process optimisation through empirical observation. While several attempts have been made to explore the underlying coating formation mechanisms, many basic questions remain unanswered. Progress has been made in the observation of discharge characteristics; nevertheless, an integrated and comprehensive model of the process has yet to be developed. In this project, recently developed plasma analysis techniques focusing on single discharge events will be used to monitor these characteristics as a function of the processing conditions.

What can optical emission spectra tell us about the PEO plasma in individual discharges? Optical emission spectra arise from the decay of excited electronic states of atoms and molecules in a hot gas or plasma. Provided that the plasma is optically thin, the strength of an emission line is proportional to the product of the number of atoms or molecules in the excited state and the rate constant for decay to the lower state. If the decay rate constants are known, the measured line strengths can be used to estimate the relative abundances of the excited states. The relative strength of lines from different species gives information about the composition of the plasma, while the relative strength of lines from the same species gives information about temperature.
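The temperature estimate described above is usually made with a Boltzmann plot: for an optically thin plasma in local thermodynamic equilibrium, ln(I·λ/(g·A)) falls linearly with the upper-level energy, with slope -1/(kT). A minimal Python sketch of the calculation, using synthetic line data (the wavelengths, energies, degeneracies and transition probabilities below are illustrative values, not measured PEO lines):

```python
import math

K_B = 8.617333e-5  # Boltzmann constant, eV/K

def boltzmann_temperature(lines):
    """Estimate excitation temperature from relative line intensities.

    Each line is (wavelength_nm, E_upper_eV, g_upper, A_per_s, intensity).
    Least-squares fit of ln(I*lam/(g*A)) against E_upper; the slope of
    that line is -1/(kT), so T = -1/(k_B * slope).
    """
    xs = [e for (_, e, _, _, _) in lines]
    ys = [math.log(i * lam / (g * a)) for (lam, _, g, a, i) in lines]
    n = len(lines)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return -1.0 / (K_B * slope)

# Synthetic demonstration: intensities generated for an assumed 8000 K
# plasma, then recovered from the fit. Line parameters are invented.
T_TRUE = 8000.0
demo = []
for lam, e, g, a in [(390.0, 3.0, 2, 5e7), (400.0, 4.0, 4, 8e7),
                     (410.0, 5.5, 6, 3e7)]:
    i = (g * a / lam) * math.exp(-e / (K_B * T_TRUE))
    demo.append((lam, e, g, a, i))

print(round(boltzmann_temperature(demo)), "K")
```

Because the synthetic intensities are generated from the same relation the fit assumes, the routine recovers the assumed temperature exactly; with real line data the scatter about the fitted line indicates how well the LTE assumption holds.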

Primary investigator

Secondary investigator

Partners

  • University of Cambridge
  • University of Sheffield
  • Keronite Ltd.

Associated research group

  • Electronics and Electrical Engineering

Date:
2012-2014
Themes:
Environmental modelling, Condition monitoring, High Voltage Engineering, Marine Energy
Funding:
National Grid plc

This project aims to develop tools for the rating and technical assessment of high power HVDC cable systems with mass impregnated insulation. The calculation of current ratings for DC cables is significantly more complex than that for AC cables. The rating is often determined by electric stress constraints rather than by considerations of thermal ageing, and is also strongly influenced by thermally induced pressure transients within the cable. In some cases the rating can be restricted by the cable being too cold, requiring calculation approaches different from those used for AC cables, where the main criterion is the maximum conductor temperature.

The project intends to develop a comprehensive framework for the rating of HVDC MI cable circuits, making use of techniques such as multiphysics modelling to examine the complex interactions of thermal and electrical stresses. The modelling of transient thermal conditions and the behaviour of the cable insulation under reversals of power flow will provide guidance for the development of dynamic rating algorithms and operational regimes suitable for high power HVDC cable circuits. Consideration will also be given to the effects of polarity reversals to ensure that the best use can be made of HVDC network links in the future.
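The stress-constraint behaviour mentioned above can be illustrated with a simplified calculation. If the insulation conductivity is taken to vary as exp(α·T) (field dependence neglected) with a logarithmic radial temperature profile, steady-state leakage-current continuity fixes the DC field as E(r) = K/(r·σ(r)), with K set by the applied voltage. The sketch below uses illustrative numbers only, not a real cable design; it shows the field migrating to the cooler outer insulation when the conductor is hot, the "stress inversion" that makes DC rating differ from the AC case:

```python
import math

def dc_stress_profile(U, r_i, r_o, T_c, T_s, alpha, n=1000):
    """Radial DC field in cable insulation, simplified model.

    U: applied voltage (kV); r_i, r_o: inner/outer insulation radii (mm);
    T_c, T_s: conductor and sheath temperatures (C); alpha: conductivity
    temperature coefficient (1/K). Returns (radii, fields) with fields
    in kV/mm.
    """
    rs = [r_i + (r_o - r_i) * k / (n - 1) for k in range(n)]

    def temp(r):  # logarithmic radial temperature profile
        return T_c - (T_c - T_s) * math.log(r / r_i) / math.log(r_o / r_i)

    # unscaled field shape 1/(r * sigma(r)), sigma ~ exp(alpha*T)
    shape = [1.0 / (r * math.exp(alpha * temp(r))) for r in rs]
    # scale so the field integrates to the applied voltage (trapezoids)
    integral = sum((shape[k] + shape[k + 1]) / 2 * (rs[k + 1] - rs[k])
                   for k in range(n - 1))
    K = U / integral
    return rs, [K * s for s in shape]

# Illustrative numbers: 450 kV across 20 mm of insulation,
# conductor at 50 C, sheath at 20 C.
rs, es = dc_stress_profile(U=450.0, r_i=25.0, r_o=45.0,
                           T_c=50.0, T_s=20.0, alpha=0.1)
print(f"field at conductor: {es[0]:.1f} kV/mm, at sheath: {es[-1]:.1f} kV/mm")
```

Under load the field at the sheath exceeds that at the conductor; when the cable is cold the profile reverts towards the capacitive (AC-like) distribution, which is why both hot and cold cases constrain the rating.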

Primary investigators

Secondary investigator

  • zh2g09

Partner

  • National Grid plc

Associated research group

  • Electronics and Computer Science

Date:
2009-2012
Themes:
Solid dielectrics, Nanomaterials and Dielectrics

Epoxy resins have been used extensively as the dielectric and mechanical support in solid insulation systems, such as electrical machines. Recently, thanks to the development of nanotechnology, epoxy nanocomposites have come to be seen as potential candidates to replace the base resin. However, the reported effects of nano-fillers have been controversial, with both positive and negative findings. It is believed that the properties of nanocomposites are related to the surface chemistry of the nano-fillers. The incorporation of nano-fillers with large interfacial areas into epoxy matrices may also modify the cure behaviour of the system, by introducing additional chemical reactions between moieties on the nano-filler surfaces and the reactants, thereby changing the chemical balance of the original system. This project sets out to investigate the effects of stoichiometry and of the nature of the interfacial areas of treated silica particles of various sizes on the properties of an epoxy-based system, and hence to provide more comprehensive insight into the relationships between the formulating reactants, the incorporated fillers, the resulting molecular architecture and the end properties of the products.

Primary investigators

  • Prof. Alun S Vaughan
  • Prof. Paul Lewin

Secondary investigator

  • vtn1g09

Partner

  • ABB Corporate Research

Associated research group

  • Electronics and Electrical Engineering

Date:
2010-2015
Themes:
Energy Harvesting, Novel Sensors
Funding:
EPSRC (EP/I005323/1)

Smart fabrics and interactive textiles (SFIT) are defined as textiles that are able to sense stimuli from the environment and react or adapt to them in a predetermined way. For example, smart textiles and garments can incorporate sensors, actuators, processing and communications for use in applications such as health monitoring, consumer products and the automotive sector. Smart fabrics and interactive textiles represent the next generation of fabrics, and the potential opportunities for exploiting them are enormous. During recent involvement with the textiles community, and in particular discussions with developers of smart fabrics and intelligent clothing, it has become clear that a major obstacle to integrating electronic functionality into fabrics is the portable power supply required. For example, whilst conductive tracks can be printed onto, or conductive yarns woven into, a fabric, the power supply for any integrated device is presently a standard battery, which requires a conventional connection, must be replaced repeatedly, and must be removed during washing. No matter how integrated the functionality of the fabric becomes, at present there is no alternative to powering the system with discrete batteries.

Energy harvesting (also known as energy scavenging) is concerned with the conversion of ambient energy present in the environment into electricity. Energy harvesting is now a significant research topic, with conferences such as PowerMEMS, IEEE MEMS, Transducers, DTIP and Eurosensors featuring at least one session on the subject. Energy harvesters do not have the energy density (energy stored for a given volume) of a battery, but offer the attraction of an integrated power supply that will last the lifetime of the application and will not require recharging or replacement. This project will focus on harvesting energy from two sources, kinetic and thermal, both of which have been identified as promising approaches for powering mobile electronics.

- Kinetic Energy Harvesting - There is a large amount of kinetic energy available from human motion, which is characterised by large amplitude, low frequency movements that can also exert large forces. It has been estimated that 67 W of energy is available in each step.

- Thermal Energy Harvesting - Harvesting of energy from heat sources (such as the human body) can be achieved by the conversion of thermal gradients to electrical energy using the Seebeck effect. There has been interest in the generation of power from body heat as a means to power wearable devices; for example, Seiko has produced a wristwatch powered by body heat. Reported power densities for micro-fabricated devices include 0.14 microW/mm^2 from a 700 mm^2 device at a temperature difference of 5 K, which is typically achievable in wearable applications.
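The Seebeck-effect conversion above can be made concrete with the standard matched-load estimate for a thermoelectric generator: open-circuit voltage V = n·S·ΔT, and delivered power P = V²/(4R) at matched load. The generator parameters in the sketch below (couple count, Seebeck coefficient, internal resistance) are assumed for illustration and are not taken from any device in the project:

```python
def teg_power(n_couples, seebeck_v_per_k, delta_t_k, internal_ohm):
    """Matched-load power of a thermoelectric generator:
    open-circuit voltage V = n*S*dT, delivered power P = V^2 / (4R)."""
    v_oc = n_couples * seebeck_v_per_k * delta_t_k
    return v_oc ** 2 / (4.0 * internal_ohm)

# Illustrative parameters (assumed): 100 thermocouples at 200 uV/K each,
# a 5 K gradient, 50 ohm internal resistance.
p = teg_power(100, 200e-6, 5.0, 50.0)
print(f"{p * 1e6:.1f} microwatts at matched load")

# Cross-check of the figure quoted above: 0.14 uW/mm^2 over 700 mm^2.
print(f"{0.14 * 700:.0f} microwatts from the reported device")  # ~98 uW
```

Both numbers land in the tens-of-microwatts range, which is the regime in which wearable thermal harvesting has to operate.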

The proposal involves using rapid printing processes and active printed inks to achieve energy harvesting fabrics. This will result in a low-cost, easy-to-design, flexible and rapid way to realise energy harvesting textiles and garments. Both inkjet and screen printing are fully accepted processes widely used in the textile industry for depositing patterns. The proposed screen and inkjet printing processes have many benefits, including low cost, repeatability, flexibility, suitability for small/medium series and mass production, short development time, compatibility with a wide range of textiles, and the capability of depositing a wide range of materials. The inks and associated printing parameters will be researched to enable the bespoke design and layout of the energy harvesting films in the application being addressed. The research will provide a toolbox of materials and processes, suitable for a range of different fabrics, that enables a user to develop the energy harvesting fabric best suited to their requirements.

Primary investigator

Secondary investigators

Partner

  • IFTH

Associated research group

  • Electronics and Electrical Engineering

Date:
2011-2012
Theme:
Educational Enhancement
Funding:
JISC

The OpenMentor Technology Transfer (OMtetra) project addresses the JISC call to exploit technology-enhanced assessment and feedback appropriately, enabling more authentic and more useful feedback on assignment performance, thus improving assessment quality, enhancing the student experience, and supporting staff. It will do this by packaging the JISC-funded OpenMentor technology innovation of the Open University and supporting its transfer to two external institutions, the University of Southampton and King's College London, where it will be developed to address their identified needs to improve student feedback.

The potential impact of the OMtetra project is profound. There is currently no tool with the simple yet compelling "value proposition" of OpenMentor: to take a set of marked assignments, profile the feedback provided, and support the tutor in reflecting upon and improving that feedback.

The project will embed sustainability in a community of users, seeded at the originating institutions and reaching out to all interested practitioners both in the UK and world-wide, providing community access and tools to ensure the continued development and use of OpenMentor.

Primary investigator

Secondary investigators

  • Denise Whitelock
  • Stylianos Hatzipanagos
  • Stuart Watt
  • Paul Gillary
  • Pei Zhang
  • Alejandra Saucedo

Partners

  • Open University
  • King's College London

Associated research group

  • Electronic and Software Systems

Cable Tunnel Ventilation Fans
Date:
2011-2012
Themes:
Modelling and Simulation, Environmental modelling, Condition monitoring
Funding:
National Grid plc

National Grid’s cable tunnel network is a vital component of the power transmission grid within the London area, with the length of cable installed in tunnels scheduled to increase significantly in the coming decades. In order to achieve the required circuit ratings for tunnel cable installations, a degree of forced cooling is necessary. Typically this is achieved through the installation of large ventilation fans that force air through the tunnel network, removing heat generated by the operation of the cable system. Where high circuit ratings are required, very large fans may be specified to provide the appropriate flow rate of cooling air through the tunnel. The maximum emergency rating of the cable circuit can be achieved by operating the fans on a 100% duty cycle; however, the cost of such operation (in terms of the electrical power used) represents a significant contribution to the OPEX costs associated with the tunnel.

Building on the modelling foundation of the RoCiT Project, this project will investigate the feasibility of advanced ventilation control schemes designed to minimise ventilation running costs while maintaining a high emergency rating on the cable circuits installed in the tunnel. This will involve the further development of thermal models of the tunnel network, along with the analysis of data from operational 400 kV transmission circuits.
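The running-cost trade-off motivating these control schemes can be made concrete with a one-line annual energy-cost estimate. All figures below (fan power, duty cycles, electricity tariff) are assumed for illustration only and are not National Grid data:

```python
def annual_fan_cost(rated_kw, duty_cycle, price_per_kwh):
    """Yearly electricity cost of running ventilation fans at a duty cycle."""
    hours_per_year = 8760
    return rated_kw * duty_cycle * hours_per_year * price_per_kwh

# Two hypothetical 90 kW fans (180 kW total) at 0.10 GBP/kWh:
full = annual_fan_cost(180.0, 1.00, 0.10)        # fans run continuously
controlled = annual_fan_cost(180.0, 0.35, 0.10)  # thermally-aware control
print(f"100% duty: GBP {full:,.0f}/yr; 35% duty: GBP {controlled:,.0f}/yr")
```

Even at these modest assumed figures the gap between continuous and controlled operation is of the order of GBP 100k per tunnel per year, which is why the control scheme must hold the duty cycle down without compromising the emergency rating.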

Primary investigators

Partner

  • National Grid plc

Associated research group

  • Electronics and Computer Science

Date:
2011-2015
Themes:
High Voltage Engineering, Condition monitoring
Funding:
National Grid plc

In recent years a significant volume of research has been undertaken to understand recent failures in oil-insulated power apparatus caused by the deposition of copper sulphide on the conductors and in the insulation paper. Dibenzyl disulphide (DBDS) has been found to be the leading corrosive sulphur compound in the insulation oil. The most commonly used mitigation technique for corrosive-sulphur-contaminated oil is passivation, normally using Irgamet 39 or 1,2,3-benzotriazole (BTA). The passivator is dissolved in the oil, where it reacts with the copper conductors to form a complex layer around the copper, preventing it from interacting with DBDS and forming copper sulphide. This research project will investigate the electrical properties of HV transformers which have tested positive for corrosive sulphur, and the evolution of those properties as the asset degrades due to sulphur corrosion. In parallel, the long-term properties of transformers with passivated insulation oil will be analysed in order to understand the passivator's stability and whether repeated dosing is necessary to sustain its performance.

Primary investigators

Secondary investigator

  • pa3g08

Associated research group

  • Electronics and Computer Science

Date:
2009-2014
Theme:
Modeling and Simulation
Funding:
EPSRC

The human brain remains one of the great frontiers of science: how does this organ, upon which we all depend so critically, actually do its job? A great deal is known about the underlying technology, the neuron, and we can observe in vivo brain activity on a number of scales through techniques such as magnetic resonance imaging, neural staining and invasive probing. But this knowledge, a tiny fraction of the information that is actually there, barely starts to tell us how the brain works from a perspective that we can understand and manipulate. Something is happening at the intermediate levels of processing that we have yet to begin to understand, and the essence of the brain's information processing function probably lies in these intermediate levels. One way to get at these middle layers is to build models of very large systems of spiking neurons, with structures inspired by the increasingly detailed findings of neuroscience, in order to investigate the emergent behaviours, adaptability and fault-tolerance of those systems.

What has changed, and why could we not do this ten years ago? Multi-core processors are now established as the way forward on the desktop, and highly-parallel systems have been the norm for high-performance computing for a considerable time. In a surprisingly short space of time, industry has abandoned the exploitation of Moore’s Law through ever more complex uniprocessors, and is embracing a 'new' Moore's Law: the number of processor cores on a chip will double roughly every 18 months. If projected over the next 25 years this leads inevitably to the landmark of a million-core processor system. Why wait?

We are building a system containing a million ARM9 cores - not dissimilar to the processor found in many mobile phones. Whilst this is not, in any sense, a powerful core, it possesses aspects that make it ideal for an assembly of the type we are undertaking. With a million cores, we estimate we can sensibly simulate - in real time - the behaviour of a billion neurons. Whilst this is less than 1% of a human brain, in the taxonomy of brain sizes it is certainly not a primitive system, and it should be capable of displaying interesting behaviour.
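The real-time budget implied above works out at roughly a thousand neurons per core; with a typical 1 ms simulation timestep, each core must then advance about a thousand neuron state updates every millisecond. A minimal sketch of that arithmetic together with a generic leaky integrate-and-fire update (a textbook model for illustration, not the machine's actual neuron equations or code):

```python
def lif_step(v, i_syn, dt=0.001, tau=0.02, v_rest=-0.065,
             v_thresh=-0.050, v_reset=-0.065, r_m=1e7):
    """One Euler step of a leaky integrate-and-fire neuron.

    v: membrane potential (V); i_syn: synaptic current (A); dt: timestep (s);
    tau: membrane time constant (s); r_m: membrane resistance (ohm).
    Returns (new_v, fired).
    """
    v += dt / tau * (v_rest - v + r_m * i_syn)
    if v >= v_thresh:        # threshold crossed: emit a spike and reset
        return v_reset, True
    return v, False

# Scaling arithmetic from the text: a million cores, a billion neurons.
cores, neurons = 1_000_000, 1_000_000_000
print(neurons // cores, "neurons per core")  # 1000

# Drive one neuron with a constant current until it spikes.
v, fired = -0.065, False
for _ in range(100):
    v, f = lif_step(v, i_syn=2e-9)
    fired = fired or f
print("spiked" if fired else "silent")
```

In an event-driven machine of this kind the interesting engineering is less the update itself than routing the resulting spike events between cores, which is where the unconventional architecture comes in.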

A number of design axioms of the architecture are radically different to those of conventional computer systems - some would say they are downright heretical. The architecture turns out to be elegantly suited to a surprising number of application arenas, but the flagship application is neural simulation; neurobiology inspired the design.

This biological inspiration draws us to two parallel, synergistic directions of enquiry; significant progress in either direction will represent a major scientific breakthrough:

  • How can massively parallel computing resources accelerate our understanding of brain function?
  • How can our growing understanding of brain function point the way to more efficient parallel, fault-tolerant computation?

Primary investigators

  • Andrew Brown
  • Professor Steve Furber, University of Manchester
  • Dr Simon Moore, University of Cambridge

Partners

  • University of Manchester
  • University of Cambridge
  • University of Sheffield

Associated research groups

  • Electronic Systems and Devices Group
  • Electronics and Electrical Engineering

Date:
2011-2013
Theme:
Knowledge Technologies
Funding:
Technology Strategy Board

The advent of new standards and initiatives for data publication in the context of the World Wide Web (in particular the move to linked data formats) has resulted in the availability of rich sources of information about the changing economic, geographic and socio-cultural landscape of the United Kingdom, and many other countries around the world. In order to exploit the latent potential of these linked data assets, we need to provide access to tools and technologies that enable data consumers to easily select, filter, manipulate, visualize, transform and communicate data in ways that are suited to specific decision-making processes.

In this project, we will enable organizations to extract maximum value from the UK’s growing portfolio of linked data assets. In particular, we will develop a suite of software components that enables diverse organizations to rapidly assemble ‘goal-oriented’ linked data applications and data processing pipelines in order to enhance their awareness and understanding of the UK’s geographic, economic and socio-cultural landscape.

A specific goal for the project will be to support comparative and multi-perspective region-based analysis of UK linked data assets (this refers to an ability to manipulate data with respect to various geographic region overlays), and as part of this activity we will incorporate the results of recent experimental efforts which seek to extend the kind of geo-centred regional overlays that can be used for both analytic and navigational purposes. The technical outcomes of this project will lead to significant improvements in our ability to exploit large-scale linked datasets for the purposes of strategic decision-making.
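The region-based analysis described above can be illustrated with plain Python tuples standing in for RDF triples; the predicates, identifiers and population figures below are invented for illustration and are not drawn from any Ordnance Survey or project dataset:

```python
# Linked-data-style triples as (subject, predicate, object) tuples.
# All identifiers and figures are hypothetical.
triples = [
    ("area:Southampton", "stat:population", 250000),
    ("area:Southampton", "geo:withinRegion", "region:SouthEast"),
    ("area:Leeds", "stat:population", 790000),
    ("area:Leeds", "geo:withinRegion", "region:Yorkshire"),
]

def areas_in_region(triples, region):
    """Select subjects linked to a given geographic region overlay."""
    return {s for (s, p, o) in triples
            if p == "geo:withinRegion" and o == region}

def population(triples, area):
    """Look up a single statistical property of an area."""
    return next(o for (s, p, o) in triples
                if s == area and p == "stat:population")

# Region-based filtering: statistics restricted to one regional overlay.
south_east = areas_in_region(triples, "region:SouthEast")
print({a: population(triples, a) for a in south_east})
```

A real pipeline would issue the equivalent selection as a query against a triple store rather than filtering in-memory tuples, but the shape of the operation, joining statistical properties through a region overlay, is the same.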

RAGLD is a collaborative research initiative between the Ordnance Survey, Seme4 Ltd and the University of Southampton, and is funded in part by the Technology Strategy Board's “Harnessing Large and Diverse Sources of Data” programme. Commencing October 2011, the project runs for 18 months.

Primary investigator

Secondary investigator

Partners

  • Ordnance Survey
  • Seme4 Limited

Associated research group

  • Web and Internet Science

Date:
2006-2009
Themes:
Complex Networks, Bio-inspired computing
Funding:
EPSRC

We are an interdisciplinary team of scientists working on an ambitious three-and-a-half-year project titled "Spatially Embedded Complex Systems Engineering" (SECSE). The research cluster spans neuroscience, artificial intelligence, geography and complex systems, brought together to understand the role of spatial organisation and spatial processes in complex networks within the domains of neural control, geo-information systems and distributed IT systems such as those implicated in air-traffic control. A key driver for the project is IT's current "network transition": from traditional systems comprising relatively isolated, hierarchically organised computational elements to large-scale, massively interconnected systems that are physically distributed and affected by local conditions, yet must remain secure, robust and efficient. The project involves several world-class research groups in the UK and takes a highly interdisciplinary approach, bringing together experts in spatial processes, adaptive processes, biosystems and design processes, employing six post-doctoral researchers and involving two further doctoral research students.

Primary investigators

  • sgb
  • Dr E. A. Di Paolo
  • Dr D. Ladley

Secondary investigators

  • Dr C. L. Buckley
  • Dr L. Barnett
  • Dr P. Fine
  • Dr B. Clark

Associated research groups

  • Science and Engineering of Natural Systems Group
  • Agents, Interaction and Complexity
