# Deriving Time to First Event from Longitudinal Data Using Level of Detail Calculations

###### Introduction

Longitudinal, or history, data are common in the life sciences and beyond: the data set contains multiple observations of each subject, repeated over time. Examples of longitudinal data include credit card transaction histories, health care utilization records, and wildlife monitoring surveys. At a minimum, a longitudinal data set has columns identifying the subject, [subject], columns containing the time of each observation, [time], and columns describing the characteristics of the observed event, [event].

A core analysis of longitudinal data is to determine, for each subject, the time to the first occurrence of an event of interest following the inception of observation of that subject. Deriving the time to first event per subject is a pivoting operation that transforms the data from multiple observations per subject to a single observation per subject. Level of detail calculations allow this pivoting operation to be carried out directly within Tableau, because we can treat the subject as the level that defines the set of rows over which to calculate. To construct calculations over each subject, the material below repeatedly uses the idiom { FIXED [subject] : ... }.

###### Analysis

We begin by identifying the first event in the history of each subject, assuming no subject has duplicated observation times:

```
// The first observation time is the smallest observation
// time over all the observations of the subject
[entry time] := { FIXED [subject] : MIN([time]) }

// Flag the observation where the time equals the
// first observation time of the subject
[is entry] := ([time] = [entry time])
```
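To build intuition for what the FIXED calculation computes, here is a plain-Python sketch of the same two fields; the sample rows and the "well"/"sick" event values are illustrative stand-ins, not from the source:

```python
# Each row is one observation: (subject, time, event).
rows = [
    ("a", 1, "well"), ("a", 3, "sick"), ("a", 5, "well"),
    ("b", 2, "well"), ("b", 4, "well"),
]

# { FIXED [subject] : MIN([time]) } -- the smallest time per subject,
# conceptually repeated onto every row of that subject.
entry_time = {}
for subject, time, _ in rows:
    entry_time[subject] = min(time, entry_time.get(subject, time))

# [is entry] flags the row whose time equals the subject's entry time.
is_entry = [time == entry_time[subject] for subject, time, _ in rows]
```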

Next we need to find the first time the outcome of interest occurred in the history, for each subject, by nulling out all events that are not of interest:

```
// Flag every observation that has the outcome characteristic of interest
[is candidate] := ([event] = "my flagged event")

// Report only the times of the observations flagged as being of interest
[candidate time] := IIF([is candidate], [time], NULL)

// Over all the events of the subject, find the time of the first event
// reporting the characteristic of interest
[outcome time] := { FIXED [subject] : MIN([candidate time]) }
```
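The same nulling-out step can be sketched in plain Python; because MIN ignores NULLs in Tableau, the analogue below simply skips the None entries. The sample data and the "sick" event (standing in for "my flagged event") are illustrative assumptions:

```python
# Observations: (subject, time, event); the event of interest is "sick".
rows = [
    ("a", 1, "well"), ("a", 3, "sick"), ("a", 5, "sick"),
    ("b", 2, "well"), ("b", 4, "sick"),
]

# [candidate time]: keep the time only where the event matches, else None.
candidate_time = [t if event == "sick" else None for _, t, event in rows]

# { FIXED [subject] : MIN([candidate time]) } -- MIN ignores NULLs, so
# this is the earliest time the flagged event occurred for each subject.
outcome_time = {}
for (subject, _, _), ct in zip(rows, candidate_time):
    if ct is not None:
        outcome_time[subject] = min(ct, outcome_time.get(subject, ct))
```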

Finally we can calculate the time to the first event following the start of observation of each subject:

```
// Simple difference between the entry and outcome times
[duration] := ([outcome time] - [entry time])
```

Remember, these calculations are reproduced over every record for each subject, so when plotting you will need to filter on [is entry] = TRUE to work with one record per subject.
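Putting the pieces together, the whole derivation, including the filter down to one record per subject, can be sketched as follows (sample rows and event values are again illustrative assumptions):

```python
# Observations: (subject, time, event); the event of interest is "sick".
rows = [
    ("a", 1, "well"), ("a", 3, "sick"),
    ("b", 2, "well"), ("b", 4, "sick"),
]

# Per-subject entry time and first flagged-event time, as above.
entry_time, outcome_time = {}, {}
for subject, time, event in rows:
    entry_time[subject] = min(time, entry_time.get(subject, time))
    if event == "sick":
        outcome_time[subject] = min(time, outcome_time.get(subject, time))

# [duration] is defined on every row; filtering on the [is entry] flag
# (time == entry time) keeps exactly one record per subject.
durations = {
    subject: outcome_time[subject] - entry_time[subject]
    for subject, time, _ in rows
    if time == entry_time[subject] and subject in outcome_time
}
```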

While this initial example is sufficient for an introduction, in the analysis of longitudinal data we are typically interested in the time to the first change in event characteristics following the initial event. To compute this we use a level of detail calculation to "push" the event characteristic of the initial observation to all the other observations of the subject.

Begin by "pulling" the event characteristic out of the initial observation, by nulling out all other observations:

```
// Return only the event from the first observation
[pull event] := IIF([is entry], [event], NULL)
```

Then "push" the event characteristic of the initial observation out to all the other observations of the subject. The MIN() is required because a level of detail calculation must aggregate, but since [pull event] is non-null for at most one observation per subject, any aggregate would return the same value:

```
// Make the first observation available to all observations
[push event] := { FIXED [subject] : MIN([pull event]) }
```

We then compare the "pushed" value of the initial observation to every other event, to find candidate observations that represent a change from the initial observation:

```
// Compare each subsequent observation to the first observation
[is candidate] := ([event] <> [push event])
```
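The pull/push idiom can be sketched in plain Python as well: "pulling" amounts to keeping the event only on the entry row, and "pushing" amounts to broadcasting that single value back to every row of the subject. Sample rows and event values are illustrative assumptions:

```python
# Observations: (subject, time, event).
rows = [
    ("a", 1, "well"), ("a", 3, "well"), ("a", 5, "sick"),
    ("b", 2, "sick"), ("b", 4, "sick"),
]

# Entry time per subject, as before.
entry_time = {}
for subject, time, _ in rows:
    entry_time[subject] = min(time, entry_time.get(subject, time))

# "Pull" the entry event (non-entry rows contribute nothing), then
# "push": one value per subject, available to all of its rows.
push_event = {
    subject: event
    for subject, time, event in rows
    if time == entry_time[subject]
}

# [is candidate]: any observation whose event differs from the entry event.
is_candidate = [event != push_event[subject] for subject, _, event in rows]
```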

From this point onward the calculations follow as before. An example packaged workbook can be found here on GitHub.

###### Notes

1. Depending on the design of the observations some subjects may have no candidate events, in which case we need to incorporate censoring into the calculated fields.
2. By generalizing the level of detail calculations we can create a calculated field to classify the type, or degree, of change from the initial observation, to the observation of first change.
3. FIXED level of detail calculations are computed before dimension filters (though after context filters), and so will not recompute when you filter your data. Any inclusion or exclusion criteria need to be explicitly written into the calculations.
4. All the calculations can be succinctly combined into a single calculated field. They were presented separately to aid with understanding the derivation.
5. This work was supported by Alberta Health Services, in preparation for a series of short lessons on longitudinal analysis in Tableau.