In principle you can certainly do that type of calculation in Tableau, using Tableau's Table Calculations. These let you compute values that involve multiple rows, which is what you need here.
The reason I say "in principle" is that Table Calculations are fine as long as the volume of data is manageable. In order to calculate the time differences between transmissions you would need to bring data for each individual transmission back to Tableau. That starts to get very slow with large numbers of rows and eventually will fail as Tableau runs out of memory. It's difficult to put an exact number on the limit because it depends on the nature of your data, but in general I would say it works fine up to tens of thousands of rows, slows down for hundreds of thousands and probably doesn't work for millions. You can sometimes get away with a couple of million (if you don't mind getting a cup of coffee while the viz refreshes).
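To make the row-to-row part concrete, here's a minimal sketch (in Python, with made-up sample data) of the kind of work a table calculation does once all the rows are back in Tableau: sort the transmissions per transmitter and take the gap to the previous one. This is just an illustration of the computation, not Tableau's actual implementation.

```python
from datetime import datetime

# Hypothetical sample data: (transmitter_id, timestamp) rows, as the
# database would hand them back for a table calculation.
rows = [
    ("A", "2023-01-01 00:00:00"),
    ("A", "2023-01-01 00:05:00"),
    ("B", "2023-01-01 00:01:00"),
    ("A", "2023-01-01 00:12:00"),
    ("B", "2023-01-01 00:04:00"),
]

def deltas_by_transmitter(rows):
    """Sort by transmitter and time, then compute the gap (in seconds)
    between each transmission and the previous one from the same
    transmitter - the same row-to-row lookup a table calc performs."""
    parsed = sorted((tid, datetime.fromisoformat(ts)) for tid, ts in rows)
    result = []
    prev = {}  # transmitter -> timestamp of its previous transmission
    for tid, ts in parsed:
        gap = (ts - prev[tid]).total_seconds() if tid in prev else None
        result.append((tid, ts, gap))
        prev[tid] = ts
    return result

for tid, ts, gap in deltas_by_transmitter(rows):
    print(tid, ts, gap)
```

Note that every row has to be in memory at once to do this, which is exactly why the volume limits above bite.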
So depending on the distribution of low and high volume transmitters in your dataset, you might get away with it or you might not.
If you've got too much data for Table Calculations, another option would be to do the calculation in the SQL used for your data connection (i.e. without defining any new tables). Depending on your database that might be quite straightforward - particularly if it supports analytical functions, which let you do the same sorts of things as Table Calculations, but on the server. I haven't used them much, but I seem to remember there are restrictions on what works in a Tableau custom SQL connection.
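For illustration, here's what that analytical-function approach looks like in SQL, using LAG() over a partition to get the previous transmission time on the server side. The table and column names are made up, and I'm using Python's built-in sqlite3 as a stand-in for your database (SQLite needs to be 3.25+ for window functions) - the same LAG(...) OVER (PARTITION BY ... ORDER BY ...) pattern works in most databases that support analytics, though the date arithmetic will differ.

```python
import sqlite3

# Stand-in database with a hypothetical transmissions table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transmissions (transmitter TEXT, ts TEXT)")
conn.executemany(
    "INSERT INTO transmissions VALUES (?, ?)",
    [
        ("A", "2023-01-01 00:00:00"),
        ("A", "2023-01-01 00:05:00"),
        ("B", "2023-01-01 00:01:00"),
        ("A", "2023-01-01 00:12:00"),
    ],
)

# LAG() fetches the previous row's timestamp within each transmitter's
# partition; the julianday() difference times 86400 gives the gap in
# seconds. Only these result rows would need to come back to Tableau.
query = """
SELECT transmitter, ts,
       (julianday(ts) - julianday(LAG(ts) OVER (
            PARTITION BY transmitter ORDER BY ts))) * 86400 AS delta_seconds
FROM transmissions
ORDER BY transmitter, ts
"""
for row in conn.execute(query):
    print(row)
```

The first row of each partition gets NULL for its delta, since there's no previous transmission to compare against.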
Well I do dozens of Calculated Fields on millions of rows, but I guess I have no style. :)
Definitely make an Extract, and then Optimize the Extract so that the Calculated Values (which would probably be Delta-T's in your case) are written back out to the Extract also. As long as you run on a fairly beefy machine, you should be able to scale up fairly huge. Yes it's better to have upstream middleware do this kind of number crunching for you, but that's not always reality.
There's no issue doing calculations on millions of rows - most calculated fields are evaluated by the database server. It's just when you have table calculations that require all the rows to be brought back to Tableau that you run into limitations - and evaluating deltas between rows requires exactly that. As I say, it's hard to say where the limit will be for any given scenario, but ultimately you're limited by the fact that Tableau is a 32-bit application, so it can only use so much memory for manipulating the rows.