I was just watching Tron Legacy...

BakaTheIdiot

...and I noticed something.

The math on how long Flynn was in the grid was way, WAY off. So off, in fact, that I needed to figure out just how long Flynn was really trapped in the grid, relative to how time passes in the grid.

First, let me break down the math:
There are a thousand millicycles in one cycle, and clock speed is measured in hertz, which in computer terms means cycles per second. This means that if a computer has a clock speed of one hertz, that computer cycles once per second, and a single millicycle lasts just a thousandth of a second. On the 1/1000 level, that's really, really fast.
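Just to make that concrete, here's a rough back-of-the-envelope sketch in Python (my own scratchpad, assuming a plain 1 Hz clock, nothing from the film itself):

Code:
clock_hz = 1.0                                # 1 cycle per second
cycle_seconds = 1.0 / clock_hz                # one full cycle lasts 1 second
millicycle_seconds = cycle_seconds / 1000.0   # a millicycle is 1/1000 of a cycle
print(millicycle_seconds)                     # 0.001 -> one millisecond per millicycle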
Flynn specifically states "hours in here were just minutes out there." Here's where the math is hilariously off.

Flynn also states that the "portal" stays open for one millicycle, which is about 8 hours on the grid. Now, assuming there are in fact 1,000 millicycles in a single cycle, that would make an hour in the grid an eighth of a thousandth of a cycle (1/8000). For that ratio to be true, one of two things must be true about the computer:

1. It has an impossibly slow clock speed.
2. The ratio of minutes to hours is completely off.

I believe the latter.
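Quick sanity check on that "eighth of a thousandth" figure, using nothing but the 8-hours-per-millicycle line from the movie (the Python is just my scratchpad again):

Code:
grid_hours_per_millicycle = 8.0
millicycles_per_cycle = 1000.0
# one grid hour expressed as a fraction of a full cycle
cycle_fraction_per_grid_hour = 1.0 / (grid_hours_per_millicycle * millicycles_per_cycle)
print(cycle_fraction_per_grid_hour)   # 0.000125 -> exactly 1/8000 of a cycle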

Let's say that in real time, 5 minutes passed. Assuming the computer ran at 1 Hz, that would equal 300 cycles, or 300,000 millicycles.

One millicycle is a thousandth of a second, assuming that same 1 Hz clock. This is on the small scale.
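Same calculation as a snippet, if you want to poke at it (again assuming the 1 Hz clock):

Code:
real_minutes = 5.0
clock_hz = 1.0
cycles = real_minutes * 60.0 * clock_hz   # 300 cycles in 5 minutes at 1 Hz
millicycles = cycles * 1000.0             # 300,000 millicycles
print(cycles, millicycles)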

Now, an Amiga 1000, which I'm using as my average 1985 machine here, clocked in at 7.16 megahertz. That's, in theory, 7,160,000 millicycles in a second. Therefore, the hours-grid-time to seconds-real-time ratio goes up dramatically.
Using this spec, and assuming that Flynn was in fact trapped in a computer grid for... let's say, 25 years, that means the following:

Flynn was trapped, grid time, for 6,538.8 years. That's on the SMALL scale.

Going off of an Amiga 1000's clock speed, and the 7.16 million millicycles in a second ratio, here's my math:
1. 7,160,000 * 8 hours = 57,280,000 hours grid time
2. 57,280,000 / 24 (hours in a day) = 2,386,666.6 days grid time
3. 2,386,666.6 / 7 (days in a week) = 340,952.381 weeks grid time
4. 340,952.381 / 52.1429 (weeks in a year, approx.) = 6,538.8 YEARS grid time!!!
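If anyone wants to double-check me, here's that exact chain in Python, using the 7,160,000-millicycles-per-second figure from above (that figure and the 8-hours-per-millicycle line are the assumptions; the rest is plain arithmetic):

Code:
millicycles_per_second = 7_160_000        # the Amiga 1000 figure used above
grid_hours = millicycles_per_second * 8   # 8 grid hours per millicycle
grid_days = grid_hours / 24
grid_weeks = grid_days / 7
grid_years = grid_weeks / 52.1429
print(grid_hours, grid_days, grid_weeks, grid_years)
# 57,280,000 -> ~2,386,666.7 -> ~340,952.4 -> ~6,538.8 years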

But that got me thinking: how slow would a computer have to be in order for 8 hours grid time to equal about 5 minutes real time?

The answer? Really damn slow.

Here's the math on that.

Assuming that the 8 hours in a millicycle have to fit into 5 minutes of real time, the computer would have to run at a mere 96 millihertz. In theory, that gets you the ratio of 8 hours grid time to 5 minutes real life.
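For what it's worth, here's where that 96 comes from as I figure it: 8 grid hours is 480 minutes, squeezed into 5 real minutes (whether you then call that 96 millihertz is my reading of it, so take it with salt):

Code:
grid_minutes = 8 * 60            # 8 grid hours = 480 minutes
real_minutes = 5
speedup = grid_minutes / real_minutes
print(speedup)                   # 96.0 -> the grid runs ~96x faster than real time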

Is this an exact science? No.

Is it even necessarily important? No.

Am I correct in my math? Maybe. I used the calculator on my computer, and did the rest in my head.

Was this fun to find out? I think so, but you tell me.

Did I ruin a childhood or two? Absolutely, and I'm not apologizing.

I'd say this is a film theory, but I'm not a part of that YouTube channel. Cheers!
 
I watched it just the other day... now I need to figure out how long Flynn was in the grid in the FIRST movie!
 
