Chapter 2: Creating the Core Model and Analytical Tools

2.1 Building an Initial Model of the Zebrafish Circadian Clock

2.1.2 The Significance of Fine-Tuning the Model

It has been described for a wide variety of fields, ranging from theoretical physics to economics and biochemical systems, that models may have to be fine-tuned, that is, their parameters adjusted very precisely, in order to bring their output in line with observations. While there is some ongoing discussion about the underlying origin and justification of this necessity, with different sides contributing arguments drawing on various naturalistic and anthropic principles, at least in the case of biological systems an explanation is more readily apparent. After all, individual processes may, via dynamics represented by Michaelis-Menten or Hill kinetics, be very sensitive to concentration changes around a very specific range; and the need to predict these threshold values becomes all the more acute when representing the emergent behaviour of a system relying on several such hypersensitive elements. For instance, it was hypothesized in the discussion of the preliminary results that even a modest misestimation of initial values could have given rise to an uncharacteristically strong first circadian cycle, with subsequent cycles experiencing a significant signal decay. This possibility is especially noteworthy considering that the predefined initial values are only directly utilized by the numerical solver in the very first (or, depending on the algorithm, the first few) of many hundred time step calculations, all of which nevertheless appear to be affected. The significance of the equation parameters, which are referenced at every single calculation step, would arguably be greater still, highlighting that slight initial dislocations may not only persist, but actually be propagated and amplified over time. Consequently, even small variations in parameter values can give rise to significant behavioural variability, pointing to the need for a thorough and methodical parameter estimation procedure when evaluating the dynamics and merit of a particular model.

Moreover, when considering the obvious next question of precisely which quality to optimize the parameters for, the task of creating a fit with experimental data turns out to be surprisingly multifaceted. After all, a "simple" readout could be broken down into dozens of primary observable features, which may be further refined by statistical analysis into a plethora of system properties, such as maximum or minimum values interpreted either as absolutes or in terms of standard deviations, average values understood as either means or medians, signal intensities measured as peak amplitudes or as integrals of signal strength over time, and so on.
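To make the threshold sensitivity described above concrete, the following minimal sketch evaluates a Hill activation term just below and just above its half-maximal constant; the chosen constant and Hill coefficient are purely illustrative values, not parameters taken from the thesis model.

```python
def hill_activation(x, K, n):
    """Hill-type activation of a target as a function of activator concentration x,
    with half-maximal constant K and Hill coefficient n."""
    return x ** n / (K ** n + x ** n)

# Illustrative values only; K and n are not fitted parameters of the thesis model.
K, n = 1.0, 8.0

# A roughly 10 % change in activator concentration around the threshold K ...
x_low, x_high = 0.95 * K, 1.05 * K
r_low, r_high = hill_activation(x_low, K, n), hill_activation(x_high, K, n)

# ... produces a several-fold larger relative change in the activation term.
print(f"relative input change:  {100 * (x_high - x_low) / x_low:.0f} %")
print(f"relative output change: {100 * (r_high - r_low) / r_low:.0f} %")
```

With a steep Hill coefficient, the roughly ten per cent change in input yields close to a fifty per cent change in output, which is precisely the kind of amplification that makes small parameter errors so consequential when several such steps are chained together.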

Summary Statistics for the Circadian Clock

At least in the case of the circadian clock, several main descriptors are thankfully well established; these include the periods, phase relationships and peak amplitudes of, and between, the oscillating concentrations of the different core clock proteins and mRNAs. Arrays of data recording these values exist for various experimental setups and measurement routines, and while not perfectly consistent, this trove proves very valuable for training model systems.

Nevertheless, it can still be worthwhile to additionally implement experimental runs especially attuned to particular aspects of the simulation objectives. Not only does this provide an opportunity to generate very specific conditions and readouts, which may uniquely support the development, and hence the accuracy, of the desired model properties, but data generated in this way may also help to put into better perspective the experimental frameworks and readout conventions employed elsewhere. For instance, it may be found that lengthy exposure to constant darkness affects the subsequently observable behaviour under different light regimes, and insights such as this may consequently help to select and sort existing data sets so as to better compare "like with like".

Yet another critically important aspect of testing simulation outputs against results from laboratory experiments concerns the quantification of data. While there are instances where an intuitive description may be effective in sorting for important qualities and difficult to supersede quantitatively, e.g. when sorting curves by a complex shape or pattern, it is generally considered more rigorous, reliable and reproducible to express results along a numerical scale. Even where raw experimental readouts provide this quality only poorly, it is often possible to use mathematical techniques to extract from relatively complex data certain key summary statistics, which may then be readily compared and contrasted across different cohorts, conditions, or investigative settings. In the context of analysing oscillations, and indeed a wide variety of signals, various signal transforms are frequently employed, including for instance Fourier, Hilbert, or wavelet transforms, to decompose the signal and readily extract the underlying qualities of interest, most importantly period, phase, and amplitude. It can further be noted that these techniques can in principle be applied to simulated data as easily as to laboratory results, opening up the prospect of largely automated program functions that reference key dynamics of the model system against established observations. In addition, techniques such as sensitivity and bifurcation analysis can be employed on this basis in silico, monitoring the effect on key summary statistics of sweeping one or several parameters across a potentially wide parameter space.
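As a sketch of how such transforms could be applied identically to simulated and laboratory traces, the short Python example below generates a synthetic, slightly damped oscillation standing in for a clock gene readout (its 24 h period, damping constant and sampling interval are arbitrary choices for illustration) and recovers the dominant period from a Fourier spectrum as well as instantaneous amplitude and phase from a Hilbert transform, using standard NumPy and SciPy routines.

```python
import numpy as np
from scipy.fft import rfft, rfftfreq
from scipy.signal import hilbert

# Synthetic stand-in for an oscillating clock readout: a slightly damped
# 24 h rhythm sampled every 0.5 h (all values chosen only for illustration).
dt = 0.5                                  # sampling interval in hours
t = np.arange(0, 240, dt)                 # ten days of "recording"
trace = np.exp(-t / 300) * np.cos(2 * np.pi * t / 24.0)

# Dominant period from the peak of the Fourier amplitude spectrum.
detrended = trace - trace.mean()
spectrum = np.abs(rfft(detrended))
freqs = rfftfreq(len(detrended), d=dt)
dominant = freqs[np.argmax(spectrum[1:]) + 1]   # skip the zero-frequency bin
print(f"estimated period: {1 / dominant:.1f} h")

# Instantaneous amplitude and phase from the analytic (Hilbert) signal.
analytic = hilbert(detrended)
envelope = np.abs(analytic)
phase = np.unwrap(np.angle(analytic))
print(f"amplitude at mid-record: {envelope[len(t) // 2]:.2f}")
print(f"total phase advance: {phase[-1] - phase[0]:.1f} rad")
```

Because the same extraction function can be run on a simulated trace and on a measured one, summary statistics of this kind lend themselves to the automated comparison routines alluded to above.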

Once again, these approaches can provide a fresh and important outlook on the viable range of parameter values, but they work best when centered on an evidently functional system serving as a "gold standard".
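By way of illustration of such an in silico parameter scan, the sketch below sweeps the shared degradation rate of a generic Goodwin-type negative feedback loop, a classic toy oscillator used here only as a stand-in for the actual clock model, and records the resulting free-running period; all equations, parameter names and numerical values are hypothetical placeholders rather than fitted quantities.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.signal import find_peaks

def goodwin(t, y, v, K, n, d):
    """Toy Goodwin-type negative feedback loop (mRNA -> protein -> repressor).
    All names and values are illustrative placeholders, not the thesis model."""
    m, p, r = y
    dm = v / (1 + (r / K) ** n) - d * m   # transcription repressed by r
    dp = m - d * p                        # translation
    dr = p - d * r                        # maturation of the repressor
    return [dm, dp, dr]

def period_from_peaks(time, trace):
    """Estimate the oscillation period as the mean spacing of successive peaks."""
    peaks, _ = find_peaks(trace)
    return float(np.mean(np.diff(time[peaks]))) if len(peaks) > 1 else float("nan")

t_eval = np.linspace(0, 600, 6000)
for d in (0.08, 0.10, 0.12, 0.14):        # sweep the shared degradation rate
    sol = solve_ivp(goodwin, (0, 600), [0.1, 0.1, 0.1],
                    args=(1.0, 1.0, 12, d), t_eval=t_eval, rtol=1e-8)
    # Discard the transient first half of the run before measuring the period.
    half = len(t_eval) // 2
    period = period_from_peaks(sol.t[half:], sol.y[0, half:])
    print(f"d = {d:.2f} /h  ->  free-running period ~ {period:.1f} h")
```

In the same way, any of the summary statistics discussed above, such as amplitude or phase relationships, could be monitored across the sweep, and a bifurcation would show up as the parameter value at which sustained oscillations appear or disappear.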

Capturing Underlying Dynamics

Finally, having surveyed different challenges and approaches for fitting the model behaviour, it should of course be pointed out that this goal is not an end in itself, but rather serves to evaluate and establish the validity of the suggested underlying dynamics. Once these interaction steps are accepted as sufficiently faithful representations of the system behaviour under scrutiny, the advantages of the modeling approach truly begin to shine, allowing easy manipulation of the system over a wide range of simulated conditions, on vastly accelerated or decelerated time scales, all while providing live readouts not only of phenotypic characteristics, but also of the dynamics themselves. Once again, it is hoped that such a finely attuned system will permit novel insights into the nature of entrainment in the zebrafish circadian clock and in gene regulatory networks (GRNs) more generally. The following sections describe the laboratory experiments carried out as part of this project, as well as the program tools being implemented to analyse and compare the corresponding results.
