databroker.Broker.get_table

Broker.get_table(headers, stream_name='primary', fields=None, fill=False, handler_registry=None, convert_times=True, timezone=None, localize_times=True)

Load the data from one or more runs as a table (pandas.DataFrame).

Parameters
headers : Header or iterable of Headers

The headers to fetch the events for

stream_name : str, optional

Get events only from the event stream with this name.

Default is ‘primary’

fields : List[str], optional

Whitelist of field names of interest; if None, all fields are returned.

Default is None

fill : bool or Iterable[str], optional

Which fields to fill. If True, fill all possible fields.

Each event will have the data filled for the intersection of its external keys and the requested fields.

Default is False

handler_registry : dict, optional

Mapping asset (filestore) specs (strings) to handlers (callable classes).

convert_times : bool, optional

Whether to convert times from float (seconds since 1970) to numpy datetime64, using pandas. True by default.

timezone : str, optional

E.g., ‘US/Eastern’; if None, use the metadatastore configuration in self.mds.config[‘timezone’].

localize_times : bool, optional

Whether the times should be localized to the ‘local’ time zone. If True (the default), the timestamps are converted to the local time zone (as configured in mds).

This is problematic for several reasons:

  • apparent gaps or duplicate times around DST transitions

  • incompatibility with every other time stamp (which is in UTC)

However, this makes the dataframe repr look nicer.

This implies convert_times.

Defaults to True to preserve backward compatibility.

Returns
table : pandas.DataFrame
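
Examples

A minimal usage sketch. The configuration name ('example'), the plan_name search criterion, and the field names ('motor', 'det') are hypothetical placeholders; only Broker.named, the db(...) search call, and get_table itself are databroker API.

>>> from databroker import Broker
>>> db = Broker.named('example')         # hypothetical configuration name
>>> headers = db(plan_name='count')      # search by start-document metadata
>>> table = db.get_table(headers,
...                      stream_name='primary',
...                      fields=['motor', 'det'],   # hypothetical field names
...                      fill=False,
...                      localize_times=False)      # keep timestamps in UTC
>>> table.head()

With convert_times=True (the default), the time column is returned as numpy datetime64 values; passing localize_times=False keeps those timestamps in UTC rather than converting them to the mds-configured local time zone.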