Background: The vast catalog of metabolites and the extensive set of interactions within even
the simplest cells make analyzing, and especially simulating, metabolic networks a daunting task.
Yet many metabolites are associated with functionally related processes, which conveniently
organizes them into pathways. A perturbation in the concentration of a single metabolite triggers
dynamic changes in the concentrations of the other metabolites in its pathway and beyond.
A standard approach to modeling such changes is a set of ordinary differential equations;
but because of the large number of dependent variables, solving such a set of equations is
computationally intensive and carries significant error. There is also no guarantee that the
solution is always non-negative or numerically stable.
Objective: The main purpose of the study is to overcome the deficiencies of standard ODE modeling of metabolic networks.
Methods: In this paper, we propose to model metabolic networks in silico as queueing networks,
similarly to the way computer networks are modeled.
Results: The proposed approach builds on successful applications of queueing theory to model other biological
processes, such as gene delivery or molecular signaling. The basic mathematical underpinning
of the modeling method is introduced. This is followed by the description of a general building block
that can be used to model changes in concentration of a single metabolite and which can be interconnected
with the interacting metabolites modeled by similar blocks.
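To make the queueing analogy concrete, the idea behind such a building block can be sketched as a simple birth-death queue: production of a metabolite corresponds to arrivals and its consumption to service, so the queue length plays the role of the metabolite copy number and is non-negative by construction. The function name, parameters, and rates below are hypothetical illustrations, not the paper's actual model.

```python
import random

def simulate_metabolite(arrival_rate, service_rate, t_end, seed=0):
    """Sketch of one metabolite pool as an M/M/1-style queue.

    Molecules are produced (arrive) at arrival_rate and consumed
    (served) at service_rate; the queue length n is the metabolite
    copy number, which stays non-negative by construction.
    """
    rng = random.Random(seed)
    t, n = 0.0, 0
    trajectory = [(t, n)]
    while t < t_end:
        # Total event rate: production is always possible,
        # consumption only when at least one molecule is present.
        total = arrival_rate + (service_rate if n > 0 else 0.0)
        t += rng.expovariate(total)       # time to the next event
        if rng.random() < arrival_rate / total:
            n += 1                        # production event
        else:
            n -= 1                        # consumption event
        trajectory.append((t, n))
    return trajectory

# Example: a pool whose consumption slightly outpaces production.
traj = simulate_metabolite(arrival_rate=5.0, service_rate=6.0, t_end=100.0)
```

Interconnecting such blocks, where the departures of one queue feed the arrivals of another, is what lets the method scale to whole pathways while sidestepping the non-negativity and stability issues of ODE solvers.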
Conclusion: The presented simulation method has the potential to become a standard, scalable
method for modeling metabolic networks.