Kp no longer updating #105

bharding512 opened this issue May 8, 2019 · 8 comments

@bharding512
Collaborator

It appears that the NGDC is no longer updating Kp. The latest update was May 2018. I sent an email to ask them what's up. I'll let you know what they say.
ftp://ftp.ngdc.noaa.gov/STP/GEOMAGNETIC_DATA/INDICES/KP_AP

I get the impression that GFZ Potsdam is becoming the new "true" source of Kp. We can use the data here, but unfortunately it's in a different format:
ftp://ftp.gfz-potsdam.de/pub/home/obs/kp-ap/tab/
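
For reference, here is a minimal sketch of pulling the .tab files down with Python's standard-library ftplib. The host and path come from the URL above; file names are taken from the directory listing rather than assumed:

# Minimal sketch: list and download the GFZ Kp .tab files over FTP.
# Host and path come from the URL above; file names are discovered
# from the directory listing rather than hard-coded.
from ftplib import FTP

ftp = FTP('ftp.gfz-potsdam.de')
ftp.login()  # anonymous login
ftp.cwd('/pub/home/obs/kp-ap/tab')

for name in ftp.nlst():
    if name.endswith('.tab'):
        with open(name, 'wb') as f:
            ftp.retrbinary('RETR ' + name, f.write)

ftp.quit()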

@bharding512
Collaborator Author

Justin Mabie from NOAA said the following:

We are no longer ingesting this data. The FTP link is still active but the file will not be updated. You can obtain it directly from the source. For KP that is GFZ in Potsdam and for F10.7 that is the Dominion Observatory.

So it looks like we have to switch to GFZ for Kp.

@timduly4
Owner

timduly4 commented May 9, 2019

Is there any documentation describing the ASCII files at GFZ Potsdam?

@mandrakos

ftp://ftp.gfz-potsdam.de/pub/home/obs/kp-ap/tab/tab_fmt.txt

@bharding512
Collaborator Author

I needed some 2019 Kp data, so I wrote this quick and dirty reader, which may or may not be useful for pyglow. Note that it requires pandas, which pyglow does not currently require:

import pandas as pd

def read_gfz_kp(fn):
    '''Read a GFZ Potsdam Kp .tab file and return it as a pandas Series
    indexed by the middle of each 3-hour interval.'''
    # Each data line is whitespace-delimited: a yymmdd date followed by
    # eight 3-hourly Kp values such as "2o", "2+", "2-".
    # skipfooter=4 drops the non-data footer lines at the end of the file.
    df = pd.read_csv(fn, header=None, skipfooter=4, engine='python')
    t = []
    kp = []
    for i in df.index:
        s = df.iloc[i, 0].split()
        datestr = s[0]
        for j in range(8):
            kpstr = s[j + 1]
            kpbase = int(kpstr[0])
            # 'o' is a whole value; '+'/'-' denote thirds (+/-1/3),
            # approximated here as +/-0.3.
            if kpstr[1] == 'o':
                kpfrac = 0.
            elif kpstr[1] == '+':
                kpfrac = 0.3
            elif kpstr[1] == '-':
                kpfrac = -0.3
            else:
                raise ValueError('Kp fraction not understood: "%s"' % kpstr[1])
            kp.append(kpbase + kpfrac)
            # Timestamp at the middle of the 3-hour interval
            t.append(pd.to_datetime(datestr, format='%y%m%d') + pd.to_timedelta(j*3 + 1.5, unit='H'))
    return pd.Series(index=t, data=kp)
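
For example, used together with the reader above (the file name and date here are placeholders for whichever monthly .tab file you downloaded):

# Example usage; 'kp.tab' and the date below are placeholders.
kp = read_gfz_kp('kp.tab')
print(kp.head())                                   # Kp at the middle of each 3-hour interval
print(kp['2019-05-08'])                            # all eight Kp values for one day
print(kp.asof(pd.Timestamp('2019-05-08 12:00')))   # most recent Kp at or before a given time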

@timduly4
Owner

We can look into revamping the geophysical index data structures instead of using arrays: https://github.com/timduly4/pyglow/blob/master/src/pyglow/generate_kpap.py#L25
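
As a rough sketch of what that could look like (the class and method names here are hypothetical, not the existing pyglow API): store the indices in a single pandas DataFrame indexed by timestamp and look values up by time instead of computing array offsets.

# Hypothetical sketch of a pandas-backed index store (not the current pyglow API).
import pandas as pd

class GeophysicalIndices:
    def __init__(self, kp, ap, f107):
        # kp, ap, f107 are pandas Series indexed by UTC timestamps
        self.df = pd.DataFrame({'kp': kp, 'ap': ap, 'f107': f107}).sort_index()

    def lookup(self, dn):
        # Return the most recent row at or before the requested time
        return self.df.asof(pd.Timestamp(dn))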

@timduly4
Owner

I like the idea of using pandas, too; it would be trivial to add it at: https://github.com/timduly4/pyglow/blob/master/requirements.txt

I created a ticket for revisiting parsing, storing, and accessing geophysical indices and can add it to my list: #108

@taiwoojotheophilus

> It appears that the NGDC is no longer updating Kp. The latest update was May 2018. I sent an email to ask them what's up. I'll let you know what they say.
> ftp://ftp.ngdc.noaa.gov/STP/GEOMAGNETIC_DATA/INDICES/KP_AP
>
> I get the impression that GFZ Potsdam is becoming the new "true" source of Kp. We can use the data here, but unfortunately it's in a different format:
> ftp://ftp.gfz-potsdam.de/pub/home/obs/kp-ap/tab

I am trying to download some parameters for 2017 and 2018 at different geographic locations from the MSIS_00 model; however, I am receiving this error: "DNET LOG ERROR NaN NaN 16.0000000". Could the issue raised above be the cause?
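
That error usually indicates NaN inputs reaching MSIS, so missing Kp/Ap after May 2018 is a plausible cause. One quick way to check (the import path and attribute names below are assumed from older pyglow examples, so verify them against your installed version) is to inspect the indices on a Point for one of the affected dates:

# Quick check (import path and attribute names assumed; verify against your pyglow version).
# If these print as nan for dates after May 2018, the stale NGDC file is the likely cause.
from datetime import datetime
from pyglow.pyglow import Point

pt = Point(datetime(2018, 6, 15, 12, 0), 0., 0., 250.)
print(pt.kp, pt.ap, pt.f107)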

@bharding512
Collaborator Author

PR #148 may have fixed this; we need to check.
