To solve the problem, we considered an algorithm-centric approach that uses active learning and semi-supervised learning.
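As a toy illustration of the active-learning component only (the 1-D threshold classifier and all names below are hypothetical, not the method actually used in the study), an uncertainty-sampling loop might be sketched as:

```python
def fit_threshold(labeled):
    """Fit a 1-D threshold classifier: midpoint between class means."""
    xs0 = [x for x, y in labeled if y == 0]
    xs1 = [x for x, y in labeled if y == 1]
    return (sum(xs0) / len(xs0) + sum(xs1) / len(xs1)) / 2

def most_uncertain(unlabeled, threshold):
    """Uncertainty sampling: pick the point closest to the boundary."""
    return min(unlabeled, key=lambda x: abs(x - threshold))

def active_learn(labeled, unlabeled, oracle, budget):
    """Repeatedly query the oracle for labels of the most uncertain points."""
    labeled, unlabeled = list(labeled), list(unlabeled)
    for _ in range(budget):
        t = fit_threshold(labeled)
        x = most_uncertain(unlabeled, t)
        unlabeled.remove(x)
        labeled.append((x, oracle(x)))
    return fit_threshold(labeled)
```

The loop spends its labelling budget near the current decision boundary, which is the essential idea behind active learning regardless of the underlying classifier.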

The study group considered possibilities for a reduction of O-negative blood usage in the Mid-West Area, where the current situation is particularly alarming. With the aim of quantifying and minimizing risks to patients in the Limerick region under a new regime in which less O-negative blood is stored at any one time, the study group explored the consequences of reducing the amount of O-negative blood stored at six local hospitals. The study group also explored the possibility of reducing inappropriate usage of O-negative blood by optimizing current routine practice, and developed practical recommendations for improving O-negative blood management that could potentially reduce inappropriate use of O-negative blood to zero.

An environmentally friendly method of removal is to use algae to clean this runoff water. The algae consume the minerals as part of their growth process. In addition to cleaning the water, the resulting algal biomass has a variety of applications, including production of biodiesel, animal feed, and products for pharmaceutical and cosmetic purposes, or it can even be used as a source of heating or electricity.

The aim of this paper is to develop a model of algae production and use this model to investigate how best to optimize algae farms to satisfy the dual goals of maximizing growth and removing mineral contaminants.

With this aim in mind the paper is split into five main sections. In the first, a review of the biological literature is undertaken with the aim of determining which factors affect the growth of algae. The second section contains a review of existing mathematical models from the literature, and for each model a steady-state analysis is performed. Moreover, for each model the strengths and weaknesses are discussed in detail. In the third section, a new two-stage model for algae production is proposed, careful estimation of parameters is undertaken, and numerical solutions are presented. In the next section, a new one-dimensional spatio-temporal model is presented, numerically solved, and optimization strategies are discussed. Finally, these elements are brought together and recommendations for how to continue are drawn.
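As an illustration of the kind of coupled growth/nutrient-removal model reviewed in the paper (a minimal sketch assuming Monod-limited uptake; the function names and parameter values are illustrative, not those of the paper), biomass and mineral concentration can be stepped forward together:

```python
def simulate(a0, n0, mu_max=1.0, K=0.5, yield_=2.0, dt=0.01, t_end=20.0):
    """Euler integration of algal biomass a(t) and mineral nutrient n(t):
       da/dt = mu_max * n/(K + n) * a   (Monod-limited growth)
       dn/dt = -(1/yield_) * da/dt     (nutrient consumed by growth)"""
    a, n = a0, n0
    for _ in range(int(t_end / dt)):
        growth = mu_max * n / (K + n) * a
        a += dt * growth
        n -= dt * growth / yield_
        n = max(n, 0.0)  # concentrations cannot go negative
    return a, n
```

The conserved quantity a + yield_ * n makes the dual objective visible: every unit of mineral removed from the water appears as new biomass.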

a disease target. In a High Throughput Assay (HTA), each compound is tested at a single concentration. In an IC50 test, a compound is tested at a range of concentrations. Sometimes there are discrepancies between the results of these tests, and the Study Group was asked to model this. Rather than assume normally distributed errors in the percentage effect, the Study Group proposed a model in which there are also probabilities p and q of a test erroneously indicating very low or high activity. The parameters p and q can be estimated from the data, and then can be used in the design of testing procedures.
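A crude illustration of how p and q might be estimated from replicate data (the cut-offs and the estimator below are hypothetical, not the Study Group's actual procedure): count the fraction of readings that report extreme activity for compounds whose true percentage effect is known to be mid-range.

```python
def estimate_pq(readings, true_effect, low_cut=5.0, high_cut=95.0, tol=20.0):
    """Estimate p (probability of a spuriously low reading) and q
    (spuriously high) from repeated tests of a compound whose true
    percentage effect is mid-range.
    readings: list of measured percentage effects."""
    low = sum(1 for r in readings if r < low_cut and true_effect - r > tol)
    high = sum(1 for r in readings if r > high_cut and r - true_effect > tol)
    n = len(readings)
    return low / n, high / n
```

With p and q in hand, one can ask, for example, how many repeats of an HTA are needed before the chance of missing an active compound falls below a target level.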

1. Minimize the total number of deaths due to influenza.

2. Minimize the total number of infections with influenza.

3. Reduce the spread of resistance to antivirals.

It is understood that not all of the objectives above can be satisfied at the same time, and the purpose of the work is to consider the outcomes in the different scenarios. The aim of the present project is to see whether optimal control theory can contribute to a better formulation of the treatment intensity, in order to bring the epidemic under control while avoiding widespread resistance in the population.
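A minimal sketch of the kind of model underlying such a control problem, assuming treatment simply adds to the recovery rate of a standard SIR model (the parameters and the constant control below are illustrative; this is not the project's actual formulation, which would also track resistant strains):

```python
def sir_with_treatment(beta=0.3, gamma=0.1, tau=0.0, s0=0.99, i0=0.01,
                       dt=0.1, t_end=300.0):
    """SIR model in which a treatment intensity tau adds to the recovery
    rate; returns the final epidemic size (fraction ever infected)."""
    s, i = s0, i0
    for _ in range(int(t_end / dt)):
        new_inf = beta * s * i       # incidence
        s += dt * (-new_inf)
        i += dt * (new_inf - (gamma + tau) * i)
    return 1.0 - s
```

In the optimal control setting, tau would become a function of time chosen to trade off the epidemic size against the selection pressure for resistance.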

Our aim, however, is to design a device: a rotating cylindrical container with tubular lights attached to its inner walls, in which the waste is exposed to ultraviolet light as it is rotated and moved towards the exit. The process continues until the whole batch has been fed through the system. Such a device would be more effective than batch-processing designs. The Study Group was asked to develop a mathematical model to analyse the effect of the number and location of the tubes, so as to maximise exposure during the residence time of the sample in the device before it is discharged, a time which itself needs to be estimated.

The problem is to estimate this error given various assumptions on the form of the distribution.

The analysis in this report gives upper bounds on the throughput if the Vitalab Flexor is operated in modes which are standard in the present situation. It is shown that a desired throughput of at least 266 tests per hour cannot be realized on the basis of these standard operation modes. Possible improvements are suggested via so-called parallel or on-line operation modes, or a combination of these two modes. These possible improvements, however, require a number of changes in the technical design of the Vitalab Flexor.

The process of cell differentiation is thought to be important in the growth of blood vessels. Cells can sense that they are part of a blood vessel, and change their shape to form tubules. Also it is likely that they change their chemical messaging properties, and their abilities to bind to other endothelial cells.

A model is developed that describes cell differentiation, and separates cells into different classes. For simplicity the spatial distribution of cells in different classes is ignored. Using simple population dynamics, a set of coupled non-linear ODEs is developed to describe the dynamics of the system. The system is found to have two different long-time states, one corresponding to the formation of blood vessels and one in which vessels do not form. The ratio of the cell proliferation rate to the cell maturation rate (the inverse of the time it takes a cell to realise that it is part of a blood vessel) is critical in determining the final state of the system.
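A toy caricature of such a two-class population model (the equations, rates, and initial conditions below are illustrative, not the Study Group's actual system) shows the same qualitative behaviour: the ratio of proliferation rate r to maturation rate k selects between the two long-time states.

```python
def long_time_state(r, k, d=0.05, dt=0.01, t_end=2000.0):
    """Toy two-class model: immature cells u proliferate logistically
    (rate r) and mature into vessel cells m (rate k); mature cells are
    lost at rate d.  Returns the final (u, m) state.
       du/dt = r*u*(1 - u - m) - k*u
       dm/dt = k*u - d*m"""
    u, m = 0.01, 0.0
    for _ in range(int(t_end / dt)):
        du = r * u * (1.0 - u - m) - k * u
        dm = k * u - d * m
        u += dt * du
        m += dt * dm
    return u, m
```

Linearising about u = m = 0 gives growth rate r - k for the immature population, so r/k > 1 leads to a vessel-forming state and r/k < 1 to extinction.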

To improve our models, future work should pursue data collection, empirical estimation of the model parameters, and examination of the underlying assumptions of our frameworks.

Further improvements could also include examining susceptibility to vCJD infection by age group and iatrogenic infections introduced through surgical instruments. Regarding the latter, it may be worthwhile to conduct experiments to quantify the transmission of prions from an infected surgical instrument after repeated sterilization procedures.

We pursued the following three approaches:

(i) the simulation of the time-harmonic linear elastic models to examine coarse scale effects and adhesion properties,

(ii) the investigation of a tri-phasic model, with the intent of upscaling this model to determine effects of electro-mechanical coupling between cells,

and (iii) the upscaling of a simple cell model as a framework for studying interface conditions at malignant cells.

Each of these approaches has opened exciting new directions of research that we plan to study in the future.

This study has allowed us to conclude the following:

1. A mediator is released from a single source cell.

2. The response to the mediator changes with distance.

3. The value of the apparent diffusion coefficient increases with distance.

4. A plausible proposed mechanism is that ATP is released and degrades to ADP.

5. Future experiments are required to confirm that ATP is the mediator as suggested.

We will first briefly review two well-known sequence alignment approaches and provide a rudimentary improvement for implementation on parallel systems. Then, we carefully examine a unique sequencing technique known as the SOLiD™ System, and follow with the results from the global and local sequence alignment.

In this report, the team presents an explanation of the algorithms for color space sequence data from the high-throughput re-sequencing technology and a theoretical parallel approach to the dynamic programming method for global and local alignment. The combination of the di-base approach and dynamic programming provides a possible viewpoint for large-scale re-sequencing projects. We anticipate the use of distributed computing to be the next-generation engine for large-scale problems such as this.
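The global-alignment dynamic programming referred to above can be sketched as follows (standard Needleman-Wunsch scoring; the match/mismatch/gap parameters are illustrative defaults, and only the score, not the traceback, is computed):

```python
def global_alignment_score(a, b, match=1, mismatch=-1, gap=-1):
    """Needleman-Wunsch global alignment score via dynamic programming."""
    n, m = len(a), len(b)
    # prev holds the previous DP row; only two rows are needed for the score.
    prev = [j * gap for j in range(m + 1)]
    for i in range(1, n + 1):
        cur = [i * gap] + [0] * m
        for j in range(1, m + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            cur[j] = max(prev[j - 1] + s,    # (mis)match
                         prev[j] + gap,      # gap in b
                         cur[j - 1] + gap)   # gap in a
        prev = cur
    return prev[m]
```

Each anti-diagonal of the DP table depends only on the previous two, which is what makes the parallel decomposition discussed in the report possible.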

In the problem under consideration, the issue is one of wicking or leaking of the sample from the reaction reservoir to the waste region at elevated temperatures. A mechanism responsible for this phenomenon was thought to be the "wedge effect", which refers to the tendency of liquids to move along a sharp corner by capillary action if the conditions are right. The analysis performed during the workshop therefore focused mainly on this effect.

While a definitive solution to this challenging problem posed in the workshop was not identified, it was felt that using a manufacturing process that can affect the corner angles in the channels may hold the most promise, allowing the wicking mechanism to be controlled without surface treatments that insert hydrophobic stops in the channel. For instance, by "rounding" the side walls to increase the corner angles from 90 toward 180 degrees, the leaking of the sample away from the reaction chamber might be delayed.

The top-ranking results of our CyberT analysis on the data supplied by EcoArray are markedly different from those obtained earlier by EcoArray using the black-box routines implemented in GeneSpring. The output of CyberT on the data is a set of genes that have high differential expression and low variance among the replicates. This suggests that we have implemented a statistically robust method for identifying differentially expressed genes and used it successfully on the data. A further indicator that our method produces more accurate biomarkers is the following: the biomarkers chosen for Cd and Hg by our method have a high degree of overlap, which makes biochemical sense given the similarity between these two toxins; the output produced by GeneSpring did not have this additional signature of consistency.

This problem is concerned with a different microfluidic problem: delivering reactants to the site of reaction. A common setup is to attach syringes full of reactant to a reaction chamber by narrow hydrophobic tubing. Using a stepper motor, a controlled dose of liquid may be injected into the tube. The hydrophobicity causes the dose to curve outward on the sides, becoming a "slug" of reactant with air in front and behind. The syringe at the rear is then switched for one full of air, and air pressure is used to drive the slug to the reaction site.

If too much pressure is applied, the slug will arrive with a significant back pressure that will be relieved through bubbling in the reaction site. This causes the formation of a foam and is highly undesirable. We present a simple model based on Boyle’s law for the motion of a slug through a tube. We then extend this model for trains of slugs separated by air bubbles. Last, we consider the case of a flooded reaction site, where the forward air bubble must be pushed through the flooding liquid.

In conclusion, we have determined the dynamics of a single slug moving towards an empty reaction chamber giving the final equilibrium position of the slug. A phase-plane analysis then determined a condition on the size of the slug needed to ensure that it comes to rest without oscillating about the equilibrium position. The effect of a flooded reaction chamber was then considered. In this case it is impossible to avoid bubbling due to the design of the device. We found that it is possible, however, to reduce the bubbling by minimising the back pressure behind the slug. Finally, the dynamics of multiple slugs with or without a flooded reaction chamber has been investigated.
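The single-slug dynamics can be sketched as follows (Boyle's law for the trapped air column ahead of the slug, with an assumed linear damping term standing in for viscous resistance; all parameter values are illustrative, not those of the report):

```python
def slug_dynamics(P_back=1.2e5, P0=1.0e5, L0=0.1, mass=1e-4, area=1e-6,
                  damping=5e-3, dt=1e-5, t_end=2.0):
    """Damped motion of a liquid slug compressing a trapped air column.
    The air ahead obeys Boyle's law: P_front = P0 * L0 / (L0 - x).
       mass * x'' = area * (P_back - P_front) - damping * x'
    Returns the final slug position x (m)."""
    x, v = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        p_front = P0 * L0 / (L0 - x)      # Boyle's law for the front air
        accel = (area * (P_back - p_front) - damping * v) / mass
        v += dt * accel                    # semi-implicit Euler step
        x += dt * v
    return x
```

Setting the net force to zero recovers the equilibrium position x_eq = L0 * (1 - P0 / P_back); whether the slug oscillates about it depends on the damping, which is the condition analysed in the phase plane.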

In this study group, we proposed two mathematical models to describe plaque growth and rupture.

The first model is a mechanical one that approximately treats the plaque as an inflating elastic balloon. In this model, the pressure inside the core increases and then decreases, suggesting that plaque stabilization and prevention of rupture are possible.

The second model is a biochemical one that focuses on the role of MMPs in degrading the fibrous plaque cap. The cap stress, MMP concentration, plaque volume and cap thickness are coupled together in a system of phenomenological equations. The equations always predict an eventual rupture since the volume, stresses and MMP concentrations generally grow without bound. The main weakness of the model is that many of the important parameters that control the behavior of the plaque are unknown.

The two simple models suggested by this group could serve as a springboard for more realistic theoretical studies. But most importantly, we hope they will motivate more experimental work to quantify some of the important mechanical and biochemical properties of vulnerable plaques.

VR Technology is a leading supplier of technical dive computers. The company is interested in expanding upon an existing algorithm (the Variable Gradient Model, VGM), which is used to design ascent profiles/decompression schedules and thereby mitigate the risk of decompression sickness in divers.

The Study Group took the approach of trying to extend the existing Haldane model to account more explicitly for the formation of bubbles. By extending the model to include bubble dynamics it was expected that some physical understanding could be gained for the existing modifications to some of the parameters. The modelling consisted of first examining the Haldane model, then considering a single small isolated bubble in each of the compartments, and finally interpreting the predictions of the model in terms of decompression profiles.
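The Haldane model tracks inert-gas tension in a set of tissue compartments, each relaxing exponentially toward the ambient partial pressure. A single compartment can be sketched as follows (the units and the values in the example are illustrative):

```python
import math

def tissue_tension(p0, p_ambient, half_time_min, t_min):
    """Haldane single-compartment nitrogen uptake/washout:
       dP/dt = k * (P_amb - P),  k = ln 2 / half-time,
    which integrates to P(t) = P_amb + (p0 - P_amb) * exp(-k t)."""
    k = math.log(2) / half_time_min
    return p_ambient + (p0 - p_ambient) * math.exp(-k * t_min)
```

A full schedule runs several such compartments (half-times from minutes to hours) in parallel and limits ascent so that no compartment's tension exceeds its allowed supersaturation; the bubble-dynamics extension replaces that supersaturation criterion with an explicit bubble model.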

In these circumstances, Dstl wish to have mathematical models that give an understanding of the process, and can be used to choose the parameters to give adequate removal of the contaminant. Mathematical models of this have been developed and analysed, and show results in broad agreement with the effects seen in experiments.

1. Why does the double exponential fit better than the single exponential?

2. Why does the time at which you terminate the data alter the predictions?

3. Is there a more robust way to fit the data than is currently being used?

4. Does the effect of transfer between the mobile molecules and the bound ones explain the difference between the two types of graphs?
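Question 2 can be illustrated with a sketch: if the data actually follow a double exponential, the rate recovered by a single-exponential (log-linear) fit depends on where the record is truncated. The amplitudes and rates below are invented for illustration, not taken from the Dstl data.

```python
import math

def double_exp(t, a1=0.7, k1=1.0, a2=0.3, k2=0.05):
    """Two-compartment decay: fast mode k1 plus slow mode k2."""
    return a1 * math.exp(-k1 * t) + a2 * math.exp(-k2 * t)

def fitted_rate(t_max, n=200):
    """Least-squares slope of log y versus t: the single-exponential
    rate obtained from data observed only up to time t_max."""
    ts = [t_max * (i + 1) / n for i in range(n)]
    ys = [math.log(double_exp(t)) for t in ts]
    tbar = sum(ts) / n
    ybar = sum(ys) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys))
             / sum((t - tbar) ** 2 for t in ts))
    return -slope
```

Short records are dominated by the fast mode and long records by the slow one, so the fitted rate drifts downward as the truncation time grows, which is exactly the sensitivity described in question 2.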

This will lower the horizontal friction, but may also bring about surface contact in high load situations.

For the AFTS, a movement path is specified, translated into platform coordinates and executed on the machine. During the execution, the load cell measures the forces and moments that act on the prosthetic foot. We wish to find the particular movement path of the Stewart platform that will generate the target force profile. Thus, we are interested in solving an inverse problem. The main goal of the workshop was to investigate potential solution methods for this ‘force-control’ problem, including looking into its feasibility.

The report begins with a description of the data collection, followed by a description of the data processing required to align two back surfaces. A section is devoted to calculating the cosmetic score, a measure of deformity of the back. The paper concludes with a few suggestions for improvements on data collection and use.

We provide an analysis that facilitates counting 3- and 4-centre pharmacophores, including a mathematical model for distance interval ratios, triangle and other inequality requirements for feasible triangles and tetrahedra, and symmetries.

Besides spatial symmetries and distance similarities for each edge of the polyhedra, there does not appear to be any other relevant structural similarity feature between two pharmacophores that can be used to reduce the classification of a typical compound.
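The triangle-inequality feasibility test for a 3-centre pharmacophore can be sketched as follows (a sketch under the assumption that each edge distance is constrained to an interval; the reduction used, which is not spelled out in the text, is that a feasible triangle exists iff each lower bound is at most the sum of the other two upper bounds):

```python
def triangle_feasible(intervals):
    """intervals: three (lo, hi) distance ranges for the edges of a
    3-centre pharmacophore.  Returns True iff some choice of edge
    lengths within the ranges satisfies the triangle inequality."""
    for i in range(3):
        lo_i = intervals[i][0]
        others = [intervals[j][1] for j in range(3) if j != i]
        if lo_i > sum(others):
            return False
    return True
```

Sufficiency follows because at most one of the inequalities can fail when all edges are set to their upper bounds, and the offending edge can then be shortened to the sum of the other two.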

The second concerned the use of comparative genomics in understanding and comparing metabolic networks in bacteria. Comparative genomics is a method for making inferences about the genome of a new organism using information from a previously characterised organism. The first mathematical question is how one would quantify such a metabolic map in a statistical sense, in particular where there are different levels of confidence for the presence of different parts of the map. The next and most important question is how one can design a measurement strategy to maximise the confidence in the accuracy of the metabolic map.