You will have to use a table calculation, probably LOOKUP(), to get the prior value for an individual sensor. There are a few things to get right in the partitioning and ordering, so it would be helpful if you could post a file with some of the data and the expected result.
I really don't think you'll be able to do this with table calculations on that volume of data. In fact, I'm sure you won't.
Table calculations rely on pulling all of the rows back to Tableau. The upper limit on the number of rows you can manage depends on various things, including row size, but I really don't think you'll manage to retrieve even 10 million rows back to Tableau (never mind 500 million), let alone do table calculations on them.
There are a couple of approaches I would consider for this, depending on how you are storing your data.
If it's in a decent database with analytic functions available I'd look at doing the heavy lifting of the calculation in the database (possibly via a custom SQL connection, though that might place some constraints on the use of analytic functions).
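To illustrate what "heavy lifting in the database" looks like, here is a minimal sketch using SQLite's window functions as a stand-in for Oracle's (Oracle's LAG(...) OVER (PARTITION BY ... ORDER BY ...) has essentially the same shape). The table and column names are made up for the example.

```python
# Prior-value lookup done server-side with an analytic (window) function.
# SQLite >= 3.25 supports LAG; the same query works in Oracle.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor TEXT, ts INTEGER, value REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?, ?)",
    [("A", 1, 10.0), ("A", 2, 12.5), ("B", 1, 7.0), ("B", 2, 6.5)],
)

# LAG fetches the previous row's value within each sensor's partition,
# ordered by timestamp -- so the database hands back each reading
# alongside the prior reading for that sensor.
rows = conn.execute(
    """
    SELECT sensor, ts, value,
           LAG(value) OVER (PARTITION BY sensor ORDER BY ts) AS prior_value
    FROM readings
    ORDER BY sensor, ts
    """
).fetchall()

for row in rows:
    print(row)
```

The point is that the 500 million rows stay in the database; Tableau only ever sees the (much smaller) result you aggregate from this.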
If you don't have a database with analytic functions, the only other way I can think of is to pre-process the data. Personally I'd probably do it by writing a Perl script (though the random arrival order might be a challenge with 500 M rows). Another option might be an ETL tool; that's not something I know much about, but others on the forum do, so someone else may chime in.
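A rough sketch of that pre-processing route (in Python here rather than Perl, and with a made-up field layout): sort the raw rows first so the random arrival order stops mattering, then walk each sensor's readings once, attaching the prior value.

```python
# Pre-process sensor rows into (sensor, ts, value, prior_value) records.
from itertools import groupby
from operator import itemgetter

# Rows as they might arrive, out of order.
raw = [("B", 2, 6.5), ("A", 1, 10.0), ("A", 2, 12.5), ("B", 1, 7.0)]

# Sorting by (sensor, timestamp) fixes the arrival order; on 500M rows
# you'd use an external sort (e.g. the Unix sort utility) rather than
# holding everything in memory like this.
raw.sort(key=itemgetter(0, 1))

enriched = []
for sensor, readings in groupby(raw, key=itemgetter(0)):
    prior = None  # first reading for each sensor has no prior value
    for _, ts, value in readings:
        enriched.append((sensor, ts, value, prior))
        prior = value

for row in enriched:
    print(row)
```

Once the file is enriched like this, the "prior value" is just an ordinary column and Tableau never needs a table calculation at all.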
Thanks. We are using Oracle, so I'm sure we have all the tools. I just need to find someone who can build the queries for me (our DBA is way too busy with projects more important than mine).