Created March 10, 2026 13:36
How to open, blaze normalize, and apply barycentric corrections to OWLS HLSPs (https://archive.stsci.edu/hlsp/owls).
| { | |
| "cells": [ | |
| { | |
| "cell_type": "markdown", | |
| "id": "7a41531f-008d-4f3e-8d92-7727bd941391", | |
| "metadata": {}, | |
| "source": [ | |
| "# OWLS HLSPs: barycentric correction and continuum normalization\n", | |
| "\n", | |
| "### Brett Morris" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "5ef0b3a4-abb5-461a-9584-c6bf22f19982", | |
| "metadata": {}, | |
| "source": [ | |
| "First let's install the APO echelle spectrum reduction toolkit called `aesop` ([paper](https://ui.adsabs.harvard.edu/abs/2018JOSS....3..854M/abstract), [github](https://github.com/bmorris3/aesop/), [docs](https://arces.readthedocs.io/en/latest/)):" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "91ee5fa5-befd-48a4-9d91-ece77b7a098c", | |
| "metadata": {}, | |
| "outputs": [], | |
| "source": [ | |
| "!pip install aesop-arces" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "911e15f1-10f9-42de-a995-f7767f402929", | |
| "metadata": {}, | |
| "source": [ | |
| "Then the imports:" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "08a28fb9-0ce8-4cbb-83c8-53fd439ca537", | |
| "metadata": {}, | |
| "outputs": [], | |
| "source": [ | |
| "import numpy as np\n", | |
| "import matplotlib.pyplot as plt\n", | |
| "\n", | |
| "import astropy.units as u\n", | |
| "from astropy.coordinates import SkyCoord, EarthLocation\n", | |
| "from astropy.time import Time\n", | |
| "from astropy.io import fits\n", | |
| "\n", | |
| "from astroquery.mast import Observations\n", | |
| "\n", | |
| "from aesop import EchelleSpectrum, Spectrum1D" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "782682b6-65e6-430a-8010-b04357ce6976", | |
| "metadata": {}, | |
| "source": [ | |
| "Let's check the available OWLS observations from MAST:" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "1ed972be-3da4-4611-a3cd-d4c1d43b8335", | |
| "metadata": {}, | |
| "outputs": [], | |
| "source": [ | |
| "# Search for all available OWLS products\n", | |
| "all_obs = Observations.query_criteria(provenance_name=\"owls\")\n", | |
| "data_products = Observations.get_product_list(all_obs)\n", | |
| "data_products.add_index('obs_id')\n", | |
| "\n", | |
| "# show a selection of the available products:\n", | |
| "data_products[130:150]" | |
| ] | |
| }, | |
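| { | |
| "cell_type": "markdown", | |
| "id": "a1b2c3d4-0001-4e5f-8a90-1b2c3d4e5f60", | |
| "metadata": {}, | |
| "source": [ | |
| "If you already know which star you want, you can narrow the query rather than browsing the full table. This is a sketch assuming MAST stores the name without a space (e.g. `GJ182`); check the `target_name` column of `all_obs` for the exact spelling:" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "a1b2c3d4-0002-4e5f-8a90-1b2c3d4e5f61", | |
| "metadata": {}, | |
| "outputs": [], | |
| "source": [ | |
| "# narrow the query to a single target (the name spelling is an assumption):\n", | |
| "gj182_obs = Observations.query_criteria(\n", | |
| "    provenance_name=\"owls\", target_name=\"GJ182\"\n", | |
| ")\n", | |
| "gj182_obs" | |
| ] | |
| }, | |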
| { | |
| "cell_type": "markdown", | |
| "id": "0b0e8da6-a6f0-437d-866a-a4ae38845467", | |
| "metadata": {}, | |
| "source": [ | |
| "## Download the standard star\n", | |
| "\n", | |
| "Download the standard star's spectrum (HZ 44):" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "1e2c9473-bc7c-4c6b-9ed1-f593f0ab5a5d", | |
| "metadata": {}, | |
| "outputs": [], | |
| "source": [ | |
| "# this is the one standard spectrum included in the OWLS HLSPs\n", | |
| "standard_star_obs_id = 'hlsp_owls_apo_arces_hz44-2024-06-19-03-34_clear'\n", | |
| "standard_star_product = data_products.loc[standard_star_obs_id]\n", | |
| "standard_star_path = Observations.download_products(standard_star_product)['Local Path'][0]" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "4743b95e-7ac9-4e79-bed9-a55f93df9b1d", | |
| "metadata": {}, | |
| "source": [ | |
| "## Download a target star\n", | |
| "\n", | |
| "Choose an observation from the table above, then enter its `obs_id` below to download. I'm choosing GJ 182 because it's a high-SNR spectrum with obvious H$\\alpha$ emission:" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "3ee548a6-cd27-45d3-881b-7b7dae3cff8e", | |
| "metadata": {}, | |
| "outputs": [], | |
| "source": [ | |
| "target_star_obs_id = 'hlsp_owls_apo_arces_gj182-2023-02-04-02-42_clear'\n", | |
| "target_star_product = data_products.loc[target_star_obs_id]\n", | |
| "target_star_path = Observations.download_products(\n", | |
| " target_star_product\n", | |
| ")['Local Path'][0]" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "74fc134c-2c27-42aa-8d68-5b7f52cdda4f", | |
| "metadata": {}, | |
| "source": [ | |
| "## Load the FITS files as `EchelleSpectrum` objects\n", | |
| "\n", | |
| "Below we will parse the HLSP files and load them [as `aesop.EchelleSpectrum` objects](https://arces.readthedocs.io/en/latest/api/aesop.EchelleSpectrum.html):" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "457ad31f-0223-4a29-86da-732a7a085c1b", | |
| "metadata": {}, | |
| "outputs": [], | |
| "source": [ | |
| "def hlsp_to_echelle_spectrum(hlsp_fits_path):\n", | |
| " \"\"\"Open OWLS HLSP, return EchelleSpectrum\"\"\"\n", | |
| " spectrum_fits = fits.open(hlsp_fits_path)\n", | |
| "\n", | |
| " wavelength_unit = u.Unit(spectrum_fits[1].header['TUNIT1'])\n", | |
| " flux_unit = u.Unit(spectrum_fits[1].header['TUNIT2'])\n", | |
| " \n", | |
| " spectrum_list = [\n", | |
| " # load HLSP files from BinTableHDUs\n", | |
| " Spectrum1D.from_array(\n", | |
| " wavelength=spec.data['WAVELENGTH'] * wavelength_unit, \n", | |
| "            flux=spec.data['FLUX'] * flux_unit\n", | |
| " )\n", | |
| " # skip primary HDU:\n", | |
| " for spec in spectrum_fits[1:]\n", | |
| " ]\n", | |
| " echelle_spec = EchelleSpectrum(\n", | |
| " spectrum_list, spectrum_fits[0].header\n", | |
| " )\n", | |
| " return echelle_spec\n", | |
| "\n", | |
| "standard_spectrum = hlsp_to_echelle_spectrum(standard_star_path)\n", | |
| "target_spectrum = hlsp_to_echelle_spectrum(target_star_path)" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "0eca42dd-1da6-4462-8486-27621091eee7", | |
| "metadata": {}, | |
| "source": [ | |
| "#### Quick look\n", | |
| "\n", | |
| "These spectra don't yet have barycentric corrections or blaze-function normalization applied. Let's take a look:" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "08293d4a-ae2b-4a85-9074-05ad624bfed2", | |
| "metadata": {}, | |
| "outputs": [], | |
| "source": [ | |
| "# EchelleSpectrum objects can be indexed by their echelle order, \n", | |
| "# we'll plot the order containing H alpha:\n", | |
| "h_alpha_order = 74\n", | |
| "\n", | |
| "fig, ax = plt.subplots(1, 2, figsize=(12, 5))\n", | |
| "target_spectrum[h_alpha_order].plot(ax=ax[0])\n", | |
| "\n", | |
| "target_spectrum[h_alpha_order].plot(ax=ax[1])\n", | |
| "ax[1].axvline(6562.8, ls='--', color='red')\n", | |
| "ax[1].set(\n", | |
| " xlim=[6560, 6566],\n", | |
| " # ylim=[800, 1200]\n", | |
| ")" | |
| ] | |
| }, | |
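| { | |
| "cell_type": "markdown", | |
| "id": "a1b2c3d4-0003-4e5f-8a90-1b2c3d4e5f62", | |
| "metadata": {}, | |
| "source": [ | |
| "The order index above was chosen by hand. As a sanity check, here's one way to find which order spans a given wavelength. This sketch assumes the orders are reachable through a `spectrum_list` attribute (the list we passed to the `EchelleSpectrum` constructor) and that each order exposes a `wavelength` array; adjust the attribute names if they differ:" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "a1b2c3d4-0004-4e5f-8a90-1b2c3d4e5f63", | |
| "metadata": {}, | |
| "outputs": [], | |
| "source": [ | |
| "# find the order whose wavelength range covers a target wavelength\n", | |
| "# (assumes each order has a `wavelength` attribute with units):\n", | |
| "def order_containing(echelle_spec, wavelength):\n", | |
| "    for index, order in enumerate(echelle_spec.spectrum_list):\n", | |
| "        if order.wavelength.min() <= wavelength <= order.wavelength.max():\n", | |
| "            return index\n", | |
| "\n", | |
| "order_containing(target_spectrum, 6562.8 * u.Angstrom)" | |
| ] | |
| }, | |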
| { | |
| "cell_type": "markdown", | |
| "id": "7b4de976-a93b-4700-bc2b-9950baf45b4a", | |
| "metadata": {}, | |
| "source": [ | |
| "On the left, we see that the blaze function is not corrected. On the right, we see that the H$\\alpha$ feature is offset from where we'd expect it to be, due to Earth's barycentric velocity and the stellar radial velocity.\n", | |
| "\n", | |
| "\n", | |
| "## Continuum normalization\n", | |
| "\n", | |
| "Fit each of the standard star's spectral orders with a fifth-order polynomial, then remove that polynomial from each of the target star's spectral orders.\n", | |
| "\n", | |
| "This works best for orders that _don't_ contain broad absorption lines in the standard star's spectrum, but it still works pretty well here:" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "8f4a22af-7c14-4e05-a91f-bc730d6cac07", | |
| "metadata": {}, | |
| "outputs": [], | |
| "source": [ | |
| "target_spectrum.continuum_normalize_from_standard(\n", | |
| " standard_spectrum, polynomial_order=5\n", | |
| ")\n", | |
| "\n", | |
| "target_spectrum[h_alpha_order].plot()" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "6cb00426-6e71-4b8c-b837-788ccb2f7a4e", | |
| "metadata": {}, | |
| "source": [ | |
| "The normalization isn't perfect because the standard has H$\\alpha$ absorption. You can further normalize the spectrum with `target_spectrum.continuum_normalize_lstsq` ([see docs](https://arces.readthedocs.io/en/latest/api/aesop.EchelleSpectrum.html#aesop.EchelleSpectrum.continuum_normalize_lstsq))." | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "c2c4f6b2-fcf6-44fd-b4ae-a454e721bbbe", | |
| "metadata": {}, | |
| "source": [ | |
| "## Barycentric correction\n", | |
| "\n", | |
| "Now let's apply barycentric corrections to each order of the target spectrum:" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "d7babbbd-6d71-487e-b08e-b34e6b1c930d", | |
| "metadata": {}, | |
| "outputs": [], | |
| "source": [ | |
| "# avoid shadowing the stdlib `datetime` module:\n", | |
| "date_obs = target_spectrum.header['DATE-OBS']\n", | |
| "time = Time(date_obs, format='isot')\n", | |
| "skycoord = SkyCoord(\n", | |
| " ra=target_spectrum.header['RA_TARG'],\n", | |
| " dec=target_spectrum.header['DEC_TARG'],\n", | |
| " unit=(u.hourangle, u.deg)\n", | |
| ")\n", | |
| "\n", | |
| "# calculate and correct for BERV\n", | |
| "berv = target_spectrum.barycentric_correction(\n", | |
| " time=time, skycoord=skycoord, location=EarthLocation.of_site(\"APO\")\n", | |
| ")" | |
| ] | |
| }, | |
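| { | |
| "cell_type": "markdown", | |
| "id": "a1b2c3d4-0005-4e5f-8a90-1b2c3d4e5f64", | |
| "metadata": {}, | |
| "source": [ | |
| "As an independent sanity check on the size of the correction, we can compute the barycentric radial velocity directly with astropy. This only checks the velocity; `aesop` already applied the corresponding wavelength shift above:" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "a1b2c3d4-0006-4e5f-8a90-1b2c3d4e5f65", | |
| "metadata": {}, | |
| "outputs": [], | |
| "source": [ | |
| "# cross-check the barycentric velocity with astropy alone:\n", | |
| "berv_check = skycoord.radial_velocity_correction(\n", | |
| "    kind='barycentric', obstime=time, location=EarthLocation.of_site(\"APO\")\n", | |
| ")\n", | |
| "berv_check.to(u.km / u.s)" | |
| ] | |
| }, | |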
| { | |
| "cell_type": "markdown", | |
| "id": "9147de7a-a59c-4126-a0c7-e730182cfbf9", | |
| "metadata": {}, | |
| "source": [ | |
| "Now let's plot H$\\alpha$ and check that BERV was removed:" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "6f692392-7666-41a5-a47f-56db8f1bff8a", | |
| "metadata": {}, | |
| "outputs": [], | |
| "source": [ | |
| "fig, ax = plt.subplots()\n", | |
| "target_spectrum[h_alpha_order].plot(ax=ax)\n", | |
| "ax.axvline(6562.8, ls='--', color='red')\n", | |
| "ax.set(\n", | |
| " xlim=[6560, 6566],\n", | |
| ")" | |
| ] | |
| }, | |
| { | |
| "cell_type": "markdown", | |
| "id": "7b9da619-e479-4951-800a-9f5b60ed5ed6", | |
| "metadata": {}, | |
| "source": [ | |
| "Remaining wavelength offsets are due to the target's radial velocity.\n", | |
| "\n", | |
| "Check out [other methods in aesop](https://arces.readthedocs.io/en/latest/api/aesop.EchelleSpectrum.html#aesop.EchelleSpectrum.rv_wavelength_shift_ransac) to measure the radial velocity by cross-correlating with a template spectrum (works best for FGK stars)." | |
| ] | |
| }, | |
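| { | |
| "cell_type": "markdown", | |
| "id": "a1b2c3d4-0007-4e5f-8a90-1b2c3d4e5f66", | |
| "metadata": {}, | |
| "source": [ | |
| "To get a feel for how large a wavelength offset a given radial velocity produces, apply the non-relativistic Doppler formula $\\Delta\\lambda = \\lambda \\, v / c$. The velocity below is a made-up example value, not a measurement of GJ 182:" | |
| ] | |
| }, | |
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "a1b2c3d4-0008-4e5f-8a90-1b2c3d4e5f67", | |
| "metadata": {}, | |
| "outputs": [], | |
| "source": [ | |
| "from astropy.constants import c\n", | |
| "\n", | |
| "# example radial velocity (hypothetical, for illustration only):\n", | |
| "rv = 30 * u.km / u.s\n", | |
| "delta_lambda = (6562.8 * u.Angstrom * rv / c).to(u.Angstrom)\n", | |
| "delta_lambda  # roughly 0.66 Angstrom at H alpha" | |
| ] | |
| }, | |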
| { | |
| "cell_type": "code", | |
| "execution_count": null, | |
| "id": "d3ae2fc2-ce41-41a3-a81d-3742bc6210ac", | |
| "metadata": {}, | |
| "outputs": [], | |
| "source": [] | |
| } | |
| ], | |
| "metadata": { | |
| "kernelspec": { | |
| "display_name": "Python 3 (ipykernel)", | |
| "language": "python", | |
| "name": "python3" | |
| }, | |
| "language_info": { | |
| "codemirror_mode": { | |
| "name": "ipython", | |
| "version": 3 | |
| }, | |
| "file_extension": ".py", | |
| "mimetype": "text/x-python", | |
| "name": "python", | |
| "nbconvert_exporter": "python", | |
| "pygments_lexer": "ipython3", | |
| "version": "3.12.12" | |
| } | |
| }, | |
| "nbformat": 4, | |
| "nbformat_minor": 5 | |
| } |