The need to feed an ever-increasing world population makes it imperative to reduce the millions of tons of avoidable perishable waste along the food supply chain. A considerable share of these losses is caused by non-optimal cold chain processes and management. This Theme Issue focuses on technologies, models and applications that monitor changes in product shelf life, defined as the time remaining until the quality of a food product drops below an acceptance limit, and that plan subsequent chain processes and logistics accordingly, in order to uncover and prevent invisible or latent losses in product quality. A central example is the first-expired-first-out strategy, which matches the remaining shelf life of each product to the expected transport duration. This introductory article summarizes the key findings of this Theme Issue, which brings together research results from around the world to promote intelligent food logistics. The articles include three case studies on the cold chain for berries, bananas and meat and an overview of different post-harvest treatments. Further contributions focus on the required technical solutions, such as wireless sensor and communication systems for remote quality supervision, gas sensors to detect ethylene as an indicator of unwanted ripening, and volatile components that indicate mould infections. The final section of this introduction discusses how improvements in food quality can be targeted by strategic changes in the food chain.
first-expired-first-out; food losses; food chain management; intelligent container; cold chain
Given a labelled tree T, our goal is to group repeating subtrees of T into equivalence classes with respect to their topologies and the node labels. We present an explicit, simple and time-optimal algorithm for solving this problem for unrooted unordered labelled trees and show that the running time of our method is linear with respect to the size of T. By unordered, we mean that the order of the adjacent nodes (children/neighbours) of any node of T is irrelevant. An unrooted tree T does not have a node that is designated as root and can also be referred to as an undirected tree. We show how the presented algorithm can easily be modified to operate on trees that do not satisfy some or any of the aforementioned assumptions on the tree structure; for instance, how it can be applied to rooted, ordered or unlabelled trees.
tree data structures; unrooted unordered labelled trees; subtree repeats
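To illustrate the problem the abstract above describes, the following sketch groups the subtrees of a rooted, unordered, labelled tree into equivalence classes by computing an AHU-style canonical string code for each node (child codes are sorted, so sibling order is irrelevant). This is only a simplified illustration under the rooted assumption, not the authors' time-optimal algorithm: building explicit string codes as done here can cost O(n^2) in the worst case, whereas the paper's method runs in linear time and handles unrooted trees. The function and variable names are illustrative, not taken from the paper.

```python
# Group repeating subtrees of a rooted, unordered, labelled tree into
# equivalence classes via canonical codes (AHU-style). Simplified sketch.

def subtree_classes(children, labels, root):
    """Map canonical code -> list of nodes whose subtrees are identical
    up to reordering of children (same topology and labels)."""
    classes = {}

    def code(v):
        # Sort the child codes so that sibling order does not matter.
        child_codes = sorted(code(c) for c in children.get(v, []))
        c = labels[v] + "(" + ",".join(child_codes) + ")"
        classes.setdefault(c, []).append(v)
        return c

    code(root)
    return classes

# Example: the subtrees rooted at nodes 1 and 2 repeat, even though
# their children appear in different orders.
children = {0: [1, 2], 1: [3, 4], 2: [5, 6]}
labels = {0: "a", 1: "b", 2: "b", 3: "c", 4: "d", 5: "d", 6: "c"}
groups = subtree_classes(children, labels, 0)
# Nodes 1 and 2 share the canonical code "b(c(),d())".
```

Each equivalence class is keyed by its canonical code, so two nodes land in the same class exactly when their subtrees agree in topology and labels.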
Bertram Hopkinson; Hopkinson bar; blast; impact; high strain rate
Composite sandwich materials have yet to be widely adopted in the construction of naval vessels, despite their excellent strength-to-weight ratio and low radar return. One barrier to their wider use is our limited understanding of their performance when subjected to air blast. This paper focuses on this problem, and specifically on the strength remaining after damage caused during an explosion. Carbon-fibre-reinforced polymer (CFRP) composite skins on a styrene–acrylonitrile (SAN) polymer closed-cell foam core are the primary composite system evaluated. Glass-fibre-reinforced polymer (GFRP) composite skins in a comparable sandwich configuration were included for comparison. Full-scale blast experiments were conducted in which 1.6×1.3 m panels were subjected to blast at a Hopkinson–Cranz scaled distance of 3.02 m kg^−1/3, i.e. 100 kg TNT equivalent at a stand-off distance of 14 m. This explosive blast represents a surface blast threat, in which the shock wave propagates in air towards the naval vessel. Hopkinson was the first to investigate the characteristics of this explosive air-blast pulse (Hopkinson 1914 Proc. R. Soc. Lond. A 89, 411–413 (doi:10.1098/rspa.1914.0008)). Further analysis is provided of the performance of the CFRP sandwich panel relative to the GFRP sandwich panel under blast loading, through the use of high-speed speckle strain mapping. After the blast events, the residual compressive load-bearing capacity is investigated experimentally, using loading conditions representative of those an in-service vessel may have to sustain. Residual strength testing is well established for post-impact ballistic assessment, but far less research has addressed the residual strength of sandwich composites after blast.
blast; composites; sandwich materials; compression after impact
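The scaled distance quoted in the abstract above follows from the standard Hopkinson–Cranz cube-root scaling law, Z = R / W^(1/3), where R is the stand-off distance in metres and W the charge mass in kg of TNT equivalent; blasts with equal Z produce geometrically similar shock waves. A minimal sketch checking the quoted figure (the function name is illustrative):

```python
# Hopkinson-Cranz cube-root scaling: Z = R / W^(1/3), in m kg^(-1/3).
def scaled_distance(stand_off_m, charge_kg_tnt):
    """Scaled distance for a charge of `charge_kg_tnt` kg TNT equivalent
    detonated at `stand_off_m` metres from the target."""
    return stand_off_m / charge_kg_tnt ** (1.0 / 3.0)

# The experiments above: 100 kg TNT equivalent at 14 m stand-off.
z = scaled_distance(14.0, 100.0)  # ~3.02 m kg^(-1/3), as quoted
```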
Planetary science beyond the boundaries of our Solar System is today in its infancy. Until a couple of decades ago, the detailed investigation of planetary properties was restricted to objects orbiting inside the Kuiper Belt. Today, we can ignore neither the fact that the number of known planets has increased by two orders of magnitude nor the fact that these planets bear little resemblance to the objects in our own Solar System. Whether this is the result of a selection bias induced by the techniques used to discover new planets—mainly radial velocity and transit—or simply proof that the Solar System is a rarity in the Milky Way, we do not yet know. What is clear, though, is that the Solar System has failed to be the paradigm not only in our Galaxy but even ‘just’ in the solar neighbourhood. This finding, although unsettling, forces us to reconsider our knowledge of planets in a different light and perhaps to question a few of the theoretical pillars on which we base our current ‘understanding’. The next decade will be critical for advancing what we should perhaps call Galactic planetary science. In this paper, I review highlights and pitfalls of our current knowledge of this topic and elaborate on how this knowledge might evolve in the next decade. More critically, I identify the scientific and technical steps that must be taken in this fascinating journey of remote exploration of planets in our Galaxy.
exoplanets; atmospheric models; space missions
functional materials; transition metal oxides; organic semiconductors; graphene; mesoporous metal–organic frameworks
Recent advances on the Web have generated unprecedented opportunities for individuals around the world to assemble into teams. And yet, because of the Web, the nature of teams and how they are assembled has changed radically. Today, many teams are ad hoc, agile, distributed, transient entities that are assembled from a larger primordial network of relationships within virtual communities. These assemblages possess the potential to unleash the high levels of creativity and innovation necessary for productively addressing many of the daunting challenges confronting contemporary society. This article argues that Web science is particularly well suited to help us realize this potential by making a substantial interdisciplinary intellectual investment in (i) advancing theories that explain our socio-technical motivations to form teams, (ii) developing new analytic methods and models to untangle the unique influences of these motivations on team assembly, (iii) harvesting, curating and leveraging the digital trace data offered by the Web to test our models, and (iv) implementing recommender systems that use insights gleaned from our richer theoretical understanding of the motivations that lead to effective team assembly.
virtual communities; networks; algorithms; teams; team assembly
This paper describes the basics of single-photon counting in complementary metal oxide semiconductors, through single-photon avalanche diodes (SPADs), and the making of miniaturized pixels with photon-counting capability based on SPADs. Some applications, which may take advantage of SPAD image sensors, are outlined, such as fluorescence-based microscopy, three-dimensional time-of-flight imaging and biomedical imaging, to name just a few. The paper focuses on architectures that are best suited to those applications and the trade-offs they generate. In this context, architectures are described that efficiently collect the output of single pixels when designed in large arrays. Off-chip readout circuit requirements are described for a variety of applications in physics, medicine and the life sciences. Owing to the dynamic nature of SPADs, designs featuring a large number of SPADs require careful analysis of the target application for an optimal use of silicon real estate and of limited readout bandwidth. The paper also describes the main trade-offs involved in architecting such chips and the solutions adopted with focus on scalability and miniaturization.
single-photon avalanche diode; avalanche photodiode; complementary metal oxide semiconductor
Nanosystems are large-scale integrated systems that exploit nanoelectronic devices. In this study, we consider double-independent-gate, vertically stacked nanowire field-effect transistors (FETs) with gate-all-around structures and a typical diameter of 20 nm. These devices, which we have successfully fabricated and evaluated, control the ambipolar behaviour of the nanostructure by selectively enabling one type of carrier. The transistors work as switches with electrically programmable polarity and thus realize an exclusive-OR operation. The intrinsically higher expressive power of these FETs, compared with standard complementary metal oxide semiconductor technology, enables us to realize more efficient logic gates, which we organize as tiles to build nanowire systems from regular arrays. This article surveys both the technology for double-independent-gate FETs and the physical and logic design tools needed to realize digital systems with this fabrication technology.
nanosystems; nanoelectronics; nanowire transistors; controllable polarity; regular arrays; logic synthesis
We analyse the pros and cons of analog versus digital computation in living cells. Our analysis is based on fundamental laws of noise in gene and protein expression, which set limits on the energy, time, space, molecular count and part-count resources needed to compute at a given level of precision. We conclude that analog computation is significantly more efficient in its use of resources than deterministic digital computation, even at relatively high levels of precision in the cell. Based on this analysis, we conclude that synthetic biology must use analog, collective analog, probabilistic and hybrid analog–digital computational approaches; otherwise, even relatively simple synthetic computations in cells, such as addition, will exceed energy and molecular-count budgets. We present schematics for efficiently representing analog DNA–protein computation in cells. Analog electronic flow in subthreshold transistors and analog molecular flux in chemical reactions both obey Boltzmann exponential laws of thermodynamics and are described by strikingly similar logarithmic electrochemical potentials. Therefore, cytomorphic circuits can help to map circuit designs between the electronic and biochemical domains. We review recent work that uses positive-feedback linearization circuits to architect wide-dynamic-range logarithmic analog computation in Escherichia coli using just three transcription factors, a parts count nearly two orders of magnitude lower than that of prior digital implementations.
analog computation; synthetic biology; cytomorphic; logarithmic computation; probabilistic computation; bioenergetics
microelectronics; silicon photonics; sensors; microsystems
The past decades have seen density functional theory (DFT) evolve from a rising star in computational quantum chemistry to one of its major players. This Theme Issue, which comes half a century after the publication of the Hohenberg–Kohn theorems that laid the foundations of modern DFT, reviews progress and challenges in present-day DFT research. Rather than trying to be comprehensive, this Theme Issue attempts to give a flavour of selected aspects of DFT.
density functional theory; excited states; solid state; liquid state; NMR; EPR
A novel treatment of non-adiabatic couplings is proposed. The derivation is based on a theorem by Hunter stating that the wave function of the complete system of electrons and nuclei can be written, without approximation, as a Born–Oppenheimer (BO)-type product of a nuclear wave function, X(R), and an electronic one, ΦR(r), which depends parametrically on the nuclear configuration R. From the variational principle, we deduce formally exact equations for ΦR(r) and X(R). The algebraic structure of the exact nuclear equation coincides with the corresponding one in the adiabatic approximation. The electronic equation, however, contains terms not appearing in the adiabatic case, which couple the electronic and the nuclear wave functions and account for the electron–nuclear correlation beyond the BO level. It is proposed that these terms can be incorporated using an optimized local effective potential.
beyond Born–Oppenheimer; density functional theory; non-adiabatic effects
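The exact product form invoked in the abstract above can be stated compactly. By Hunter's theorem, the full molecular wave function factorizes without approximation as a Born–Oppenheimer-type product, made unique (up to an R-dependent phase) by the standard partial normalization condition on the electronic factor:

```latex
\Psi(\mathbf{r},\mathbf{R}) \;=\; \Phi_{\mathbf{R}}(\mathbf{r})\, X(\mathbf{R}),
\qquad
\int \lvert \Phi_{\mathbf{R}}(\mathbf{r}) \rvert^{2}\, d\mathbf{r} \;=\; 1
\quad \text{for every fixed } \mathbf{R}.
```

Here X(R) is the nuclear wave function and Φ_R(r) the electronic wave function depending parametrically on the nuclear configuration R, as defined in the abstract; the normalization condition is the standard one for this factorization and is stated here for context.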
With the aim of applying X-ray phase imaging based on Talbot–Lau grating interferometry clinically, to joint diseases and breast cancer, machines employing a conventional X-ray generator have been developed and installed in hospitals. The operation of the machine for diagnosing rheumatoid arthritis is described; it relies on the fact that cartilage in finger joints can be depicted with a dose of several milligray. An image of the palm of a volunteer, acquired with a 19 s exposure (total scan time: 32 s), is reported, with cartilage features depicted in the joints. This machine is now dedicated to clinical research with patients.
grating interferometry; phase contrast; X-ray; clinics
Scanning electron microscopy (SEM) is used to evaluate potential chromosome preparations and staining methods for application in high-resolution three-dimensional X-ray imaging. Our starting point is optical fluorescence microscopy, the standard method for chromosomes, which only gives structural detail at the 200 nm scale. In principle, with suitable sample preparation protocols, including contrast enhancing staining, the surface structure of the chromosomes can be viewed at the 1 nm level by SEM. Here, we evaluate a heavy metal nucleic-acid-specific stain, which gives strong contrast in the backscattered electron signal. This study uses SEM to examine chromosomes prepared in different ways to establish a sample preparation protocol for X-rays. Secondary electron and backscattered electron signals are compared to evaluate the effectiveness of platinum-based stains used to enhance the contrast.
scanning electron microscopy; chromosomes; layer structure
A double event, supported as part of the Royal Society scientific meetings, was organized in February 2013 in London and at Chicheley Hall in Buckinghamshire by Dr A. Olivo and Prof. I. Robinson. The theme linking the two events was the use of X-ray phase in novel imaging approaches, as opposed to conventional methods based on X-ray attenuation. The event in London, led by Olivo, addressed the main roadblocks that X-ray phase contrast imaging (XPCI) is encountering in terms of commercial translation, for clinical and industrial applications. The main driver behind this is the development of new approaches that enable XPCI, traditionally a synchrotron method, to be performed with conventional laboratory sources, thus opening the way to its deployment in clinics and industrial settings. The satellite meeting at Chicheley Hall, led by Robinson, focused on the new scientific developments that have recently emerged at specialized facilities such as third-generation synchrotrons and free-electron lasers, which enable the direct measurement of the phase shift induced by a sample from intensity measurements, typically in the far field. The two events were therefore highly complementary, covering both the applied/translational and the blue-sky aspects of the use of phase in X-ray research.
optics; image processing
Enhanced oil recovery (EOR) techniques can significantly extend global oil reserves once oil prices are high enough to make these techniques economic. Given a broad consensus that we have entered a period of supply constraints, operators can at last plan on the assumption that the oil price is likely to remain relatively high. This, coupled with the realization that new giant fields are becoming increasingly difficult to find, is creating the conditions for extensive deployment of EOR. This paper provides a comprehensive overview of the nature, status and prospects for EOR technologies. It explains why the average oil recovery factor worldwide is only between 20% and 40%, describes the factors that contribute to these low recoveries and indicates which of those factors EOR techniques can affect. The paper then summarizes the breadth of EOR processes, the history of their application and their current status. It introduces two new EOR technologies that are beginning to be deployed and which look set to enter mainstream application. Examples of existing EOR projects in the mature oil province of the North Sea are discussed. It concludes by summarizing the future opportunities for the development and deployment of EOR.
enhanced oil recovery; crude oil recovery; water injection; miscible gas; water alternating gas; chemical flooding
Abundant supplies of oil form the foundation of modern industrial economies, but the capacity to maintain and grow global supply is attracting increasing concern. Some commentators forecast a peak in the near future and a subsequent terminal decline in global oil production, while others highlight the recent growth in ‘tight oil’ production and the scope for developing unconventional resources. There are disagreements over the size, cost and recoverability of different resources, the technical and economic potential of different technologies, the contribution of different factors to market trends and the economic implications of reduced supply. Few debates are more important, more contentious, more wide-ranging or more confused. This paper summarizes the main concepts, terms, issues and evidence that are necessary to understand the ‘peak oil’ debate. These include: the origin, nature and classification of oil resources; the trends in oil production and discoveries; the typical production profiles of oil fields, basins and producing regions; the mechanisms underlying those profiles; the extent of depletion of conventional oil; the risk of an approaching peak in global production; and the potential of various mitigation options. The aim is to introduce the subject to non-specialist readers and provide a basis for the subsequent papers in this Theme Issue.
oil supply; energy; liquid fuels; alternative fuels; ‘peak oil’