Duplicate incident ids from API

Hi all,

I recently started using the API integration to generate incidents. The process is pretty simple:

  1. [Custom code] Get a list of items that have issues (each item has a unique identifier used as the incident_key)
  2. [API code] Get a list of all incidents (id+incident_key) using GET https://api.pagerduty.com/incidents
  3. [API code] For any incident_key in 1 but not in 2 create new incident using POST https://api.pagerduty.com/incidents
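The diff in step 3 boils down to a set difference. A minimal sketch (the keys and incident id here are made-up placeholders):

```python
# Step 1: incident_key for each item that currently has an issue (hypothetical keys).
have_issues = {"db-1", "web-2", "cache-3"}

# Step 2: open incidents already in PagerDuty, keyed by incident_key (made-up id).
open_incidents = {"web-2": "PABC123"}

# Step 3: keys that need a new incident created via POST /incidents.
to_create = have_issues - set(open_incidents)

print(sorted(to_create))  # ['cache-3', 'db-1']
```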

The problem I am seeing: after running this for a bit, I start to get duplicate incident ids from step 2 and missing incident_keys, which manifests as the error "Open incident with matching dedup key already exists on this service".

Here is a snippet of the Python code I am using to get a list of all incidents:

import requests

headers = {
    "Authorization": "Token token=YOUR_API_KEY",  # REST API token
    "Content-Type": "application/json",
}

ret = {}  # incident_key -> incident id
more = True
offset = 0
limit = 100
while more:
    querystring = {"service_ids[]": "something", "offset": offset, "limit": limit}

    resp = requests.get("https://api.pagerduty.com/incidents", headers=headers, params=querystring)
    if resp.status_code != 200:
        raise RuntimeError("GET /incidents failed: {}".format(resp.status_code))

    resp_json = resp.json()

    for i in resp_json["incidents"]:
        if i["status"] != "resolved":
            if i["incident_key"] in ret:
                print("Duplicate id {} {}".format(i["id"], i["incident_key"]))
            ret[i["incident_key"]] = i["id"]

    more = resp_json["more"]
    offset = offset + limit

The service dashboard shows N incidents, but this code returns fewer than N because at least one is a duplicate. The duplicates have an identical id and incident_key, so it's as if one incident is simply missing. The total number of iterations does match the N incidents on the dashboard.
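To see how an unstable ordering produces exactly this symptom, here's a toy simulation (not PagerDuty itself) of offset pagination against a server whose sort order changes between requests:

```python
# Two page requests of size 2 over four items; the server re-orders between them.
page1_order = ["A", "B", "C", "D"]
page2_order = ["B", "C", "A", "D"]  # order changed before the second request

# offset=0 reads from the first ordering, offset=2 from the second.
fetched = page1_order[0:2] + page2_order[2:4]

print(fetched)  # ['A', 'B', 'A', 'D'] -- "A" appears twice, "C" is never returned
```

Same total number of items fetched, but one duplicate and one missing: exactly the behavior above.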

The big problem is that when I list all the incidents, incident_key X does not exist, but when I create an incident using X I get the error saying it does.

Code snippet for creation of incidents:

payload = {
    "incident": {
        "type": "incident",
        "incident_key": "some unique id",
        "title": "something bad",
        "service": {
            "id": "something",
            "type": "service_reference"
        },
        "body": {
            "type": "incident_body",
            "details": "custom details"
        }
    }
}

requests.post("https://api.pagerduty.com/incidents", data=json.dumps(payload), headers=headers)

I figured this out. Without sort_by specified in the query params, the API's ordering is not deterministic, which is why I was seeing duplicate ids returned. So if you plan to paginate through all incidents, you must specify sort_by. Hope this helps someone else!
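Concretely, the fix is just one extra query parameter. A sketch of the amended query string (incident_number:asc is one stable choice; created_at:asc should also work):

```python
offset = 0
limit = 100
querystring = {
    "service_ids[]": "something",
    "offset": offset,
    "limit": limit,
    "sort_by": "incident_number:asc",  # stable order across pages
}
print(querystring["sort_by"])  # incident_number:asc
```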

I recommend this value default to something reasonable when not specified.


Hey there, thanks for posting your solution, especially regarding the non-deterministic nature of the API output.

We also observed some inconsistencies between the value of total and the actual number of incidents returned (across ~50 pages of output).

Our query now includes &sort_by=incident_number:asc, which addressed these duplicates.


I know I’m late to this thread, but I’d like to recommend the Events API v2, which would be a lot faster and more efficient.

With the Events API, you wouldn't need a REST API GET plus a synchronous incident-creation POST; a single fire-and-forget POST is enough. The Events API is asynchronous, so you get an HTTP response back very quickly.

The dedup_key field takes the place of incident_key and won’t trigger any new incidents if there’s already one open that matches the key.
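For reference, a minimal sketch of an Events API v2 trigger event (the routing key comes from an Events API v2 integration on the service; all values here are placeholders):

```python
import json

def build_trigger_event(routing_key, dedup_key, summary):
    # Events API v2 trigger event; POST this JSON to
    # https://events.pagerduty.com/v2/enqueue
    return {
        "routing_key": routing_key,
        "event_action": "trigger",
        "dedup_key": dedup_key,  # plays the role of incident_key
        "payload": {
            "summary": summary,
            "source": "my-monitor",  # placeholder source name
            "severity": "error",
        },
    }

event = build_trigger_event("YOUR_ROUTING_KEY", "some unique id", "something bad")
print(json.dumps(event, indent=2))

# To send it (fire-and-forget):
# requests.post("https://events.pagerduty.com/v2/enqueue", json=event)
```

If an open incident on the service already matches dedup_key, this trigger is folded into it instead of creating a new incident.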

The only drawback is that the incident key uniqueness check is performed per service rather than globally, so if your solution requires selecting or triggering across multiple services, the Events API would not suffice. Also note that the incident or action isn't applied immediately after the API issues its response; it's asynchronous and takes some time to complete (usually under 10 seconds, and often under 5).