The Nuclear Spectroscopic Telescope Array (NuSTAR) \cite{Harrison_2013} is a NASA Astrophysics Small Explorer observatory launched in June 2012. NuSTAR is composed of two co-aligned hard X-ray telescopes focused onto two focal plane modules (hereafter FPMA and FPMB). We previously described the time-dependent gain calibration of the detectors in \cite{Madsen_2015}. Here we describe an updated approach to the time-dependent gain and our process for validating the gain corrections to determine whether any additional corrections as a function of time are required.
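The validation process described above can be sketched in outline: measure the centroid of a known calibration line at several epochs, compare it to the known line energy, and fit for any residual drift in the energy scale. The following is a minimal illustrative sketch, not the flight pipeline; the epoch data are synthetic, and the assumed 0.1\%/yr drift, event counts, and window are hypothetical choices for the example. The 86.54 keV line energy is that of the on-board calibration source discussed below.

```python
# Illustrative sketch (NOT the NuSTAR pipeline): track the measured centroid of a
# calibration line across epochs to check for residual gain drift.
import numpy as np

REF_ENERGY = 86.54  # keV, known calibration-line energy

def line_centroid(energies, window=(80.0, 93.0)):
    """Estimate the line centroid as the mean event energy within a window."""
    e = energies[(energies >= window[0]) & (energies <= window[1])]
    return e.mean()

rng = np.random.default_rng(42)
# Synthetic epochs with a hypothetical 0.1%/yr multiplicative gain drift
years = np.array([0.0, 1.0, 2.0])
drift = 1.0 - 0.001 * years
centroids = np.array([
    line_centroid(rng.normal(REF_ENERGY * d, 0.5, size=20000)) for d in drift
])

# Linear fit of (measured / true) line energy vs. time:
# the slope estimates the fractional gain-drift rate
slope, intercept = np.polyfit(years, centroids / REF_ENERGY, 1)
print(f"fitted drift: {slope:+.5f} per year, gain at t=0: {intercept:.4f}")
```

A negligible fitted slope would indicate that the applied gain corrections leave no residual time dependence; a significant slope would motivate an additional correction.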
The conversion of the pulse heights (PHAs) collected by the NuSTAR detectors into calibrated pulse-invariant (PI) energy values is a multi-stage process that relies on several underlying CALDB files (for a full description see the NuSTAR data analysis software users guide). Several corrections were made to the CLC files after launch to account for differences in the gain of the FPMA detectors relative to pre-launch estimates, using the in-flight
$^{155}$Eu calibration source. However, the lines produced by this source are predominantly at high energies (86.54 and 105.4 keV), with several blended X-ray lines at low energies (including strong lines at 6.06 and 6.71 keV), so the energy scale at low energies is less directly constrained. In \cite{Madsen_2015} we used two epochs of the calibration source to determine the time-dependent change in the energy scale using the standard Xspec formalism, in which the transfer function to the corrected PI values is: