Fortran Parameter Search

When the parameter space is low-dimensional and the model evaluation is cheap, brute-force grid search offers the simplest and most transparent calibration strategy. Fortran's raw loop performance makes it ideal for exhaustive sweeps over millions of parameter combinations.

1. Brute-Force Grid Search

Consider a model with \(d\) parameters, each discretised into \(N\) grid points. The total number of evaluations is:

$$N_{\text{eval}} = N^d$$

For \(d = 3\) parameters with \(N = 100\) grid points each, this is \(10^6\) evaluations—easily tractable if each evaluation takes microseconds (as in a simple CA step or analytic formula).
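The cost estimate can be sketched in a few lines; `grid_cost` is a hypothetical helper (not from any library), and `eval_seconds` is the assumed cost of one model evaluation:

```python
# Sketch: total evaluations N_eval = N**d and a rough sequential wall-time
# estimate. `grid_cost` is a hypothetical helper for illustration only.
def grid_cost(N: int, d: int, eval_seconds: float) -> tuple[int, float]:
    """Return (number of evaluations, estimated sequential wall time in s)."""
    n_eval = N ** d
    return n_eval, n_eval * eval_seconds

# d = 3, N = 100, 1 microsecond per evaluation: 10^6 evaluations, ~1 s total.
n_eval, seconds = grid_cost(100, 3, 1e-6)
print(n_eval, seconds)  # 1000000 1.0
```

Adding a fourth dimension multiplies the cost by another factor of \(N\), which is why grid search is usually confined to low-dimensional problems.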

At each grid point \(\boldsymbol{\theta}_k\), we evaluate an objective (or “figure of merit”):

$$\text{FoM}(\boldsymbol{\theta}) = -\sum_{i=1}^{T}\bigl(N_i^{\text{obs}} - N_i^{\text{sim}}(\boldsymbol{\theta})\bigr)^2$$

The negative sum of squared residuals makes the best fit the maximum of FoM. Alternatively, one can use pattern-match metrics, Kappa statistics for spatial agreement, or fuzzy set similarity for comparing categorical maps.
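A minimal sketch of this objective in NumPy (the function name `figure_of_merit` is ours, not from the text's codebase):

```python
# Sketch: negative sum of squared residuals as a figure of merit.
import numpy as np

def figure_of_merit(obs, sim):
    """Larger is better; the maximum value 0 is attained at a perfect fit."""
    obs = np.asarray(obs, dtype=float)
    sim = np.asarray(sim, dtype=float)
    return float(0.0 - np.sum((obs - sim) ** 2))

# A perfect fit scores 0; any mismatch scores strictly lower.
fom_perfect = figure_of_merit([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
fom_off     = figure_of_merit([1.0, 2.0, 3.0], [1.0, 2.0, 4.0])
```

The pattern-match, Kappa, and fuzzy-set alternatives mentioned above would slot into the same role: any scalar where larger means a better match works as a FoM.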

2. Parallelisation Strategy

Grid search is embarrassingly parallel: each grid point is independent. In Fortran, OpenMP parallelism takes little more than a directive. Note, however, that a plain MAX reduction can find the best FoM value but not which parameters produced it, so the best-so-far record must be updated inside a critical section:

!$OMP PARALLEL DO COLLAPSE(3) PRIVATE(i,j,k,r_val,K_val,sigma_val,fom)
do i = 1, Nr
  do j = 1, NK
    do k = 1, Nsigma
      r_val = r_min + (i-1) * dr
      K_val = K_min + (j-1) * dK
      sigma_val = sigma_min + (k-1) * dsigma
      fom = evaluate_model(r_val, K_val, sigma_val)
      if (fom > best_fom) then      ! cheap pre-check limits lock contention
!$OMP CRITICAL
        if (fom > best_fom) then    ! re-check under the lock
          best_fom = fom
          best_r = r_val; best_K = K_val; best_sigma = sigma_val
        end if
!$OMP END CRITICAL
      end if
    end do
  end do
end do
!$OMP END PARALLEL DO

The speedup is nearly linear in the number of cores, since the critical section is entered rarely. On a 64-core machine, the \(10^6\) evaluations complete in roughly \(10^6 / 64 \approx 15{,}600\) sequential evaluations' worth of time.
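The same map-then-reduce pattern, sketched in Python. Here `evaluate_model` is a made-up stand-in with a known optimum at (0.05, 5.0, 1.0); threads keep the sketch self-contained, though a CPU-bound pure-Python objective would need processes, and the compiled OpenMP loop above remains the production path:

```python
# Sketch of the embarrassingly parallel pattern: map the objective over all
# grid points independently, then reduce to the argmax at the end.
from concurrent.futures import ThreadPoolExecutor
from itertools import product

def evaluate_model(theta):
    # Made-up smooth objective with its maximum at (0.05, 5.0, 1.0).
    r, K, sigma = theta
    return -((r - 0.05) ** 2 + (K - 5.0) ** 2 + (sigma - 1.0) ** 2)

def parallel_grid_search(r_grid, K_grid, sigma_grid, workers=4):
    points = list(product(r_grid, K_grid, sigma_grid))
    # Each point is independent, so the map parallelises trivially.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        foms = list(pool.map(evaluate_model, points))
    # Serial reduction: argmax over the figure-of-merit values.
    best = max(range(len(points)), key=foms.__getitem__)
    return points[best], foms[best]

best_theta, best_fom = parallel_grid_search(
    [0.04, 0.05, 0.06], [4.0, 5.0, 6.0], [0.5, 1.0, 1.5])
```

Doing the argmax as a serial reduction after the parallel map sidesteps the synchronisation issue that the Fortran version handles with a critical section.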

Memory Considerations

Storing all FoM values requires \(N^d \times 8\) bytes (double precision). For \(100^3 = 10^6\) points, that is only 8 MB. For \(100^4\), it is 800 MB—still feasible on modern workstations, but one should consider streaming results to disk for larger searches.
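A quick sanity check of these numbers, plus one way to stream results to disk (`np.memmap` is one option; the file name here is arbitrary):

```python
# Sketch: storage needed for a full FoM grid, and a disk-backed alternative
# for searches too large to hold in RAM.
import os
import tempfile
import numpy as np

def fom_storage_bytes(N: int, d: int, itemsize: int = 8) -> int:
    """Bytes to store one double-precision FoM value per grid point."""
    return N ** d * itemsize

# 100^3 points -> 8 MB (trivial); 100^4 -> 800 MB (borderline).
print(fom_storage_bytes(100, 3))  # 8000000
print(fom_storage_bytes(100, 4))  # 800000000

# For larger searches, write results into a disk-backed array as they are
# computed rather than keeping the whole grid resident:
path = os.path.join(tempfile.mkdtemp(), "fom_grid.dat")
fom_grid = np.memmap(path, dtype=np.float64, mode="w+", shape=(100, 100))
fom_grid[0, 0] = -123.0   # stored on disk, paged in on demand
fom_grid.flush()
```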

3. Fortran Code: Parameter Search for Urban CA

The following Fortran program performs a 2D grid search over growth rate \(r\) and carrying capacity \(K\), evaluating a logistic model against synthetic observations. Results are written to a file for visualisation.

program parameter_search
  implicit none
  integer, parameter :: Nr = 80, NK = 80, Nobs = 40
  real(8) :: r_min, r_max, K_min, K_max, dr, dK
  real(8) :: r_val, K_val, N0, dt_obs
  real(8) :: N_pred, residual, fom, best_fom
  real(8) :: best_r, best_K
  real(8) :: obs(Nobs), t_obs(Nobs)
  real(8) :: fom_grid(Nr, NK)
  integer :: i, j, n

  ! True parameters for synthetic data
  real(8), parameter :: r_true = 0.05d0, K_true = 500000.d0
  real(8), parameter :: sigma = 15000.d0
  N0 = 50000.d0

  ! Generate synthetic observations
  dt_obs = 2.d0
  do n = 1, Nobs
    t_obs(n) = (n-1) * dt_obs
    obs(n) = K_true / (1.d0 + (K_true/N0 - 1.d0)*exp(-r_true*t_obs(n)))
    ! Add deterministic sine-based pseudo-noise (reproducible across runs)
    obs(n) = obs(n) + sigma * 0.1d0 * sin(dble(n)*13.7d0)
  end do

  ! Grid bounds
  r_min = 0.01d0;  r_max = 0.10d0
  K_min = 200000.d0; K_max = 800000.d0
  dr = (r_max - r_min) / (Nr - 1)
  dK = (K_max - K_min) / (NK - 1)

  best_fom = -1.d30

  ! Nested loop search
  do i = 1, Nr
    r_val = r_min + (i-1)*dr
    do j = 1, NK
      K_val = K_min + (j-1)*dK
      ! Evaluate figure of merit (neg. sum of squared residuals)
      fom = 0.d0
      do n = 1, Nobs
        N_pred = K_val / (1.d0 + (K_val/N0 - 1.d0)*exp(-r_val*t_obs(n)))
        residual = obs(n) - N_pred
        fom = fom - residual*residual
      end do
      fom_grid(i,j) = fom
      if (fom > best_fom) then
        best_fom = fom
        best_r = r_val
        best_K = K_val
      end if
    end do
  end do

  ! Output results
  open(unit=10, file='param_search.dat', status='replace')
  do i = 1, Nr
    do j = 1, NK
      write(10, '(3E16.8)') r_min+(i-1)*dr, K_min+(j-1)*dK, fom_grid(i,j)
    end do
    write(10, *)  ! blank line for gnuplot
  end do
  close(10)

  write(*,'(A,F8.5,A,F12.1)') 'Best: r = ', best_r, '  K = ', best_K
  write(*,'(A,E14.6)') 'Best FoM = ', best_fom
end program parameter_search

4. Python Wrapper: Sensitivity Surface

We replicate the Fortran grid search in Python (for portability) and visualise the objective function landscape. The surface reveals parameter identifiability: a sharp peak means the parameters are well-determined; an elongated ridge signals correlation.
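A minimal NumPy replication of the Fortran search above (same synthetic data, grid bounds, and deterministic noise; the heat-map plotting itself is left to matplotlib or gnuplot):

```python
# Sketch: the Fortran 2-D grid search replicated with NumPy broadcasting.
import numpy as np

r_true, K_true, N0 = 0.05, 5.0e5, 5.0e4
t_obs = 2.0 * np.arange(40)                           # observation times
obs = K_true / (1.0 + (K_true / N0 - 1.0) * np.exp(-r_true * t_obs))
obs = obs + 1500.0 * np.sin(np.arange(1, 41) * 13.7)  # same sine pseudo-noise

r_grid = np.linspace(0.01, 0.10, 80)
K_grid = np.linspace(2.0e5, 8.0e5, 80)

# Broadcast to a (Nr, NK, Nobs) prediction cube, then sum out the time axis.
r = r_grid[:, None, None]
K = K_grid[None, :, None]
pred = K / (1.0 + (K / N0 - 1.0) * np.exp(-r * t_obs[None, None, :]))
fom = -np.sum((obs - pred) ** 2, axis=-1)             # FoM surface, (80, 80)

i, j = np.unravel_index(np.argmax(fom), fom.shape)
best_r, best_K = r_grid[i], K_grid[j]
print(f"Best: r = {best_r:.5f}  K = {best_K:.1f}")
```

Plotting `fom` as a heat map over (`r_grid`, `K_grid`) makes identifiability visible at a glance: a tight peak versus an elongated ridge.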

Figure: Parameter Sensitivity Surface for Urban Growth Calibration (interactive Python script, 77 lines).

5. Grid Search vs. MCMC

| Criterion   | Grid Search                               | MCMC                                                 |
|-------------|-------------------------------------------|------------------------------------------------------|
| Scaling     | \(O(N^d)\) (curse of dimensionality)      | \(O(n)\) per sample, independent of \(d\)            |
| Output      | Full FoM landscape                        | Posterior samples                                    |
| Best for    | 1–3 parameters, cheap models              | Any dimensionality, expensive models                 |
| Resolution  | Fixed by grid spacing                     | Adaptive (concentrates on high-probability regions)  |
| Parallelism | Embarrassingly parallel                   | Requires ensemble methods (e.g., parallel tempering) |
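The adaptive behaviour in the MCMC column can be sketched with a bare-bones random-walk Metropolis sampler. The log-posterior here is a made-up Gaussian centred on the "true" parameters; a real run would wrap the model's FoM instead:

```python
# Sketch: minimal random-walk Metropolis, for contrast with grid search.
# Samples concentrate near the posterior mode instead of covering a grid.
import numpy as np

def log_post(theta):
    # Hypothetical quadratic log-posterior centred on (0.05, 5e5).
    r, K = theta
    return -0.5 * (((r - 0.05) / 0.01) ** 2 + ((K - 5.0e5) / 5.0e4) ** 2)

def metropolis(theta0, steps, step_sizes, seed=0):
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    chain = np.empty((steps, theta.size))
    for n in range(steps):
        prop = theta + step_sizes * rng.standard_normal(theta.size)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
            theta, lp = prop, lp_prop
        chain[n] = theta                          # repeat current on reject
    return chain

chain = metropolis([0.03, 3.0e5], steps=5000,
                   step_sizes=np.array([0.005, 2.5e4]))
```

After a short burn-in, the sample mean and spread of `chain` estimate the posterior directly, which is the uncertainty quantification a raw grid search does not provide.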

Practical Recommendation

Use grid search for initial exploration (to understand the landscape shape and identify rough parameter bounds), then refine with MCMC for proper uncertainty quantification. The Fortran grid search is particularly valuable when the model itself is a Fortran simulation: the inner loop stays entirely in compiled code with no Python overhead.