Hello, everyone. I am a research scientist preparing for a talk. I am creating a Flash app to graphically illustrate the simulations I have been running. The code I have been running these simulations with is in FORTRAN, so I have created a number of output files with relevant data, which are then read in by Flash and stored in arrays. I then step through these arrays to manipulate graphics on screen. I have run into a bit of a problem that I am hoping to get some help with.
My problem is that I would like to be able to slow down and speed up the simulation playback at any point. My hope is that I can scale the speed anywhere from one second of run time = one second of simulation time all the way up to one second of run time = one hour of simulation time. At the faster speeds I would like the animation to look continuous rather than discretely updating the screen once per simulated hour.
This may sound rather confusing, so in other words: suppose there is a digital clock display on screen. I would like it to tick at an adjustable rate, so that at any point while it is running the user can change the tick rate anywhere from one second up to one hour of clock time for every second of real time. It is important, though, that at high tick rates it still passes through every second, just much faster. A rough sketch of the kind of approach I am imagining is below.
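In case it helps make the question concrete, here is a minimal ActionScript 3 sketch of the sort of mechanism I am picturing (the names SimClock, timeScale, simSeconds and updateGraphics are placeholders I have made up for illustration, not code I actually have): each frame, measure the elapsed real time with getTimer(), multiply it by an adjustable scale factor between 1 and 3600, and add the result to an accumulated simulation clock.

package
{
    import flash.display.Sprite;
    import flash.events.Event;
    import flash.utils.getTimer;

    public class SimClock extends Sprite
    {
        private var timeScale:Number = 1;   // 1 = real time, 3600 = one hour of sim time per real second
        private var simSeconds:Number = 0;  // accumulated simulation time, in seconds
        private var lastTick:int;           // last getTimer() reading, in milliseconds

        public function SimClock()
        {
            lastTick = getTimer();
            addEventListener(Event.ENTER_FRAME, onFrame);
        }

        // Called from a slider or buttons so the user can change speed at any moment
        public function setTimeScale(scale:Number):void
        {
            timeScale = scale;
        }

        private function onFrame(e:Event):void
        {
            var now:int = getTimer();
            var realDelta:Number = (now - lastTick) / 1000.0;  // real seconds since last frame
            lastTick = now;

            // Advance the simulation clock by the scaled amount; no simulated seconds
            // are skipped, they are just traversed faster when timeScale is large
            simSeconds += realDelta * timeScale;

            updateGraphics(simSeconds);
        }

        // Placeholder: here I would convert simSeconds into a fractional index
        // into the hourly data arrays and interpolate between entries
        private function updateGraphics(t:Number):void
        {
            // var hoursElapsed:Number = t / 3600.0;  // includes a fractional part for interpolation
        }
    }
}

The idea would be that the user only ever changes timeScale, and the clock display and graphics are driven from simSeconds; interpolating between the hourly data points using the fractional part of simSeconds / 3600 would hopefully give the continuous look at high speed. Does this sound like a reasonable direction, or is there a better pattern for this in Flash?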
I hope this makes sense. Thank you for reading and any assistance you can offer.