OpenFEMA Developer Resources

Welcome to the OpenFEMA Developer Resources page, devoted to providing additional development information regarding our Application Programming Interface (API) for use in your applications and mashups. The API is free of charge and does not currently require user registration. Please contact the OpenFEMA Team at openfema@fema.dhs.gov to suggest additional datasets or API features.

Please review the API Documentation for a list of commands that can be used with each endpoint. As OpenFEMA's main purpose is to act as a content delivery mechanism, each endpoint represents a dataset. Therefore, the documentation does not outline each one; they all operate in the same manner. Metadata (content descriptions, update frequency, data dictionary, etc.) for each dataset can be found on the individual dataset pages. The Data Sets page provides a list of the available endpoints.

The Changelog identifies new, changing, and deprecated datasets, and describes new features to the API.

The API Specifics/Technical portion of the FAQ may be of particular interest to developers.

The Large Data Set Guide provides recommendations and techniques for working with OpenFEMA's large data files. Some code examples are included.

The following are examples, or recipes, of commonly performed actions, many expressed in different programming or scripting languages. We will continue to expand this section. If you have code examples you would like to see, please contact the OpenFEMA Team. We also welcome any code examples you would like to provide.

alert - info

OpenFEMA provides sample code and examples on GitHub! Please visit github.com/FEMA

Accessing Data from API Endpoint

There are many ways to access data from the OpenFEMA API, such as using a programming language, a scripting language, or a built-in command. The following examples demonstrate how to get data from an OpenFEMA API endpoint. All of these examples return disaster summaries for Hurricane Isabel (disaster number 1491).

alert - warning

Note that not all of the data may be returned. By default, only 1,000 records are returned per call. If more data exists, it will be necessary to page through the data to capture it all. See the API Documentation and the GitHub paging examples for more information.

HTTP/URL – Paste into your browser's address bar.

https://www.fema.gov/api/open/v2/DisasterDeclarationsSummaries?$filter=disasterNumber eq 1491

cURL – Saving returned data to a file. Note the URL %20 encoding used for spaces.

curl 'https://www.fema.gov/api/open/v2/DisasterDeclarationsSummaries?$filter=disasterNumber%20eq%201491' >> output.txt

wget – Saving returned data to a file.

wget -O output.txt 'https://www.fema.gov/api/open/v2/DisasterDeclarationsSummaries?$filter=disasterNumber%20eq%201491'

Windows PowerShell 3.0 – Note that the site requires TLS 1.2, so the security protocol must be set first.

[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
Invoke-WebRequest -Uri 'https://www.fema.gov/api/open/v2/DisasterDeclarationsSummaries?$filter=disasterNumber%20eq%201491' -OutFile c:\temp\output.txt
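
Python – A further illustration (an addition to the examples above, not taken from the OpenFEMA samples): a minimal sketch using the standard urllib library, with the same endpoint and filter as above.

import json
import urllib.request

# Same endpoint and filter as the examples above, with %20 encoding for spaces.
url = ("https://www.fema.gov/api/open/v2/DisasterDeclarationsSummaries"
       "?$filter=disasterNumber%20eq%201491")

with urllib.request.urlopen(url) as response:
    data = json.loads(response.read().decode("utf-8"))

# Records are returned under a key named after the dataset, alongside a metadata object.
print(len(data["DisasterDeclarationsSummaries"]), "records returned")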

Paging Through Data

For performance reasons, only 1,000 records are returned per API endpoint call by default (i.e., if no value is set for $top). The maximum number of records that can be returned per call is 10,000 (i.e., $top=10000). If more than 10,000 records exist for the specified filter, it will be necessary to page through the data using the $skip and $inlinecount parameters to retrieve every record. The metadata header returned as part of the dataset JSON response will only display the full record count if the $inlinecount parameter is used; otherwise, it will have a value of 0. Code containing a loop can then make repeated API calls, incrementing the $skip parameter each time, until the number of records retrieved equals the total record count. See the URI commands section of the OpenFEMA API Documentation for additional information regarding these parameters.
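
As an illustration of that loop, the following is a hedged Python sketch (ours, not from the GitHub samples) that pages through the Disaster Declarations Summaries dataset using $inlinecount, $top, and $skip. It assumes the metadata object exposes the full record count in a field named count.

import json
import urllib.request

base_url = "https://www.fema.gov/api/open/v2/DisasterDeclarationsSummaries"
top = 1000      # records per call; the maximum allowed is 10,000
skip = 0
records = []

while True:
    # $inlinecount=allpages asks the server to include the full record count in the metadata.
    url = f"{base_url}?$inlinecount=allpages&$top={top}&$skip={skip}"
    with urllib.request.urlopen(url) as response:
        result = json.loads(response.read().decode("utf-8"))

    records.extend(result["DisasterDeclarationsSummaries"])
    total = result["metadata"]["count"]   # full record count (assumed field name)

    skip += top
    if skip >= total:
        break

print(f"Retrieved {len(records)} of {total} records")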

NOTE: Although a few of the examples provided download CSV files, it is recommended that results be downloaded in JSON format. This format is native to the OpenFEMA data store; as such, the data need not be converted by the server, thereby improving download performance. Further, when using CSV there is no guarantee that the record order will be maintained (although this is very unlikely, and we have been unable to reproduce it in tests).

alert - info

Please see our GitHub OpenFEMA Samples repository for download/paging examples in various languages including Bash, Python, R, JavaScript, and more.

Other Common Code Examples to be Added Soon

  • Downloading full files
  • Periodic updates instead of full downloads
  • Checking for dataset data updates
  • Converting JSON to a different format
  • Working with different time formats
  • Using the metadata endpoints

IPAWS Archived Alerts Query Examples

The Integrated Public Alert and Warning System (IPAWS) Archived Alerts data set is unique among OpenFEMA datasets in that the information is hierarchical in nature. Performing searches through the API can be challenging and the ability to filter, search, and sort IPAWS data is limited. In most cases it will be necessary to first download a subset based on a filter to limit data to a region or date, and then post-process offline with external tools.
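
For example, a date-bounded filter to pull down such a subset might look like the following URL. The v1 endpoint name and the sent field are assumptions for illustration; check the IPAWS Archived Alerts dataset page for the exact names.

https://www.fema.gov/api/open/v1/IpawsArchivedAlerts?$filter=sent ge '2021-09-01' and sent le '2021-09-08'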

OpenFEMA generally uses utilities and tools built into the Linux operating system. For example, once the data has been downloaded, a utility called jq can be used to extract and manipulate the JSON data, and even export it to a CSV file. This must be done with care, however, because the hierarchical nature of CAP messages can introduce duplicate records in the results.
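
The same extract-to-CSV step that jq performs can also be sketched in Python. The file name and field names below (IpawsArchivedAlerts, identifier, sent, info, event) are illustrative assumptions only, not the documented IPAWS schema; consult the dataset's data dictionary for the actual structure.

import csv
import json

# Load a previously downloaded subset of IPAWS alerts (hypothetical file name).
with open("ipaws_subset.json") as f:
    data = json.load(f)

with open("ipaws_subset.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["identifier", "sent", "event"])
    for alert in data.get("IpawsArchivedAlerts", []):
        # A CAP message may carry several info blocks; writing one row per block
        # is how the duplicate-looking records mentioned above can appear.
        for info in alert.get("info", []):
            writer.writerow([alert.get("identifier"), alert.get("sent"), info.get("event")])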

alert - info

Please see our GitHub OpenFEMA Dataset Examples repository for specific examples of IPAWS queries. Included are examples on basic filtering by a variety of fields, retrieving data by state, retrieving data by IPAWS event code, retrieving COVID-19 specific data, and executing geospatial queries.