Software Open Access
Gradient Algorithm
Published: Aug. 21, 2015. Version: 1.0.0
Please include the standard citation for PhysioNet:
Goldberger, A., Amaral, L., Glass, L., Hausdorff, J., Ivanov, P. C., Mark, R., ... & Stanley, H. E. (2000). PhysioBank, PhysioToolkit, and PhysioNet: Components of a new research resource for complex physiologic signals. Circulation [Online]. 101 (23), pp. e215–e220.
Abstract
Analytical techniques from optimal control theory for finding an optimal stimulus waveform that induces switches in neuronal states are often difficult to apply or extremely time intensive. Here we present code for a gradient-based algorithm that has been used to find energetically optimal stimulus waveforms for triggering an action potential in the Hodgkin-Huxley model and for initiating and suppressing repetitive firing in the FitzHugh-Nagumo model. These two models serve only as examples; the code can easily be adapted to any other system.
Project description
This directory contains code and documentation for three applications of a stochastically seeded gradient algorithm, as described in the included paper. The gradient algorithm solves optimization problems given a set of constraints and an optimization metric. We have applied the algorithm to three neuronal applications:
- Triggering an action potential in the Hodgkin-Huxley model (an empirically validated ionic model of neuronal excitability)
- Initiating repetitive firing in the FitzHugh-Nagumo model (an abstract model applied to a wide range of biological systems that exhibit an oscillatory state and a quiescent state), and
- Suppressing repetitive firing in the FitzHugh-Nagumo model
In our study we demonstrate that the stochastic seeding enables automated exploration of a wide solution space and yields multiple locally optimal solutions. Furthermore, the algorithm is robust enough that no a priori knowledge of the optimal stimulus is necessary.
The code provided here is preset to run these three applications, but it can also be used as a training tool for applying the gradient algorithm to any other application. A minimal sketch of the stochastic-seeding idea is shown below.
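The following toy MATLAB sketch illustrates only the general idea of stochastic seeding: plain gradient descent is launched from many random starting points and the distinct local minima are collected. The cost function, seed range, step size, and iteration count are all hypothetical; the actual gradAlg.m optimizes a stimulus waveform subject to the model dynamics described in the paper.

```matlab
% Toy illustration of a stochastically seeded gradient descent.
% J is a hypothetical nonconvex cost with two local minima; it stands in
% for the stimulus-energy metric optimized by the real gradAlg.m.
J     = @(x) (x.^2 - 1).^2 + 0.3*x;
nSeed = 20;        % number of random seeds
eta   = 0.01;      % gradient-descent step size
h     = 1e-6;      % finite-difference step for the numerical gradient
optima = zeros(nSeed, 1);

for s = 1:nSeed
    x = 4*rand - 2;                          % stochastic seed in [-2, 2]
    for iter = 1:2000
        g = (J(x + h) - J(x - h)) / (2*h);   % central-difference gradient
        x = x - eta*g;                       % descent step
    end
    optima(s) = x;                           % locally optimal solution
end

disp(unique(round(optima, 3)))               % distinct local minima found
```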
Directory Contents
- Gradient Algorithm.pdf
  - The paper describing this work.
- Hodgkin-Huxley/
  - gradAlg.m: The main program to run.
  - hh.m: Function describing the Hodgkin-Huxley equations.
  - pInfluence.m: Function describing the p influence equations.
  - RInfluence.m: Function describing the R influence equations.
- FitzHugh-Nagumo/Initiating Repetitive Firing/
  - fhn.m: Function describing the FitzHugh-Nagumo equations (an illustrative sketch of such a function appears after this list).
  - gradAlg.m: The gradient algorithm for a single run with a given initial and terminal condition.
  - optInOut.m: The main program to run; it processes a run between the quiescent state and each point on the repetitive firing limit cycle.
  - outX.mat: A set of 68 points defining the repetitive firing limit cycle.
  - pInfluence.m: Function describing the p influence equations.
  - RInfluence.m: Function describing the R influence equations.
- FitzHugh-Nagumo/Suppressing Repetitive Firing/
  - fhn.m: Function describing the FitzHugh-Nagumo equations.
  - gradAlg.m: The gradient algorithm for a single run with a given initial and terminal condition.
  - optOutIn.m: The main program to run; it processes a run between each point on the repetitive firing limit cycle and the quiescent state.
  - outX.mat: A set of 68 points defining the repetitive firing limit cycle.
  - pInfluence.m: Function describing the p influence equations.
  - RInfluence.m: Function describing the R influence equations.
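For orientation, here is an illustrative sketch of what an fhn.m-style right-hand-side function typically looks like. The function name, signature, and parameter values (a = 0.7, b = 0.8, epsilon = 0.08) are assumptions taken from the classic FitzHugh-Nagumo formulation; the actual fhn.m may use a different parameterization or interface.

```matlab
% fhn_sketch.m -- hypothetical stand-in for fhn.m (classic FitzHugh-Nagumo).
% y = [v; w] is the state (fast voltage-like variable and slow recovery
% variable) and I is the applied stimulus current.
function dydt = fhn_sketch(t, y, I)
    a = 0.7; b = 0.8; epsilon = 0.08;   % classic FitzHugh-Nagumo constants
    v = y(1);
    w = y(2);
    dv = v - v^3/3 - w + I;             % fast (excitation) variable
    dw = epsilon*(v + a - b*w);         % slow (recovery) variable
    dydt = [dv; dw];
end
```

Saved as fhn_sketch.m, it can be integrated with a constant stimulus, e.g. [t, y] = ode45(@(t, y) fhn_sketch(t, y, 0.5), [0 200], [-1; 1]).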
Access
Access Policy:
Anyone can access the files, as long as they conform to the terms of the specified license.
License (for files):
Open Data Commons Attribution License v1.0
Discovery
Topics:
optimization
Files
Total uncompressed size: 2.4 MB.
Access the files
- Download the files using your terminal:
  wget -r -N -c -np https://physionet.org/files/gradient-algorithm/1.0.0/
- Download the files using AWS command line tools:
  aws s3 sync --no-sign-request s3://physionet-open/gradient-algorithm/1.0.0/ DESTINATION
Name | Size | Modified
---|---|---
Initiating-Repetitive-Firing | |
Suppressing-Repetitive-Firing | |