Hey Guys and Gals,
I’m playing around with Flash on iOS and I’m having a heck of a time trying to figure stuff out. I’m following Ammar’s method found here:
http://forums.adobe.com/thread/769247
The part that had me confused was this:
_spriteContainer.bitmapData = _frameBD[_currentFrame]; // where _frameBD is a vector containing the BitmapData objects that hold the animation.
Sprites don’t have a bitmapData property, so what I did was create a Sprite with a Bitmap child added to it, and then I send the different frames of BitmapData into that Bitmap.
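Roughly, my wrapper looks like this (a simplified sketch; names like AnimSprite and _frames are just what I’m calling things here):

package {
    import flash.display.Bitmap;
    import flash.display.BitmapData;
    import flash.display.Sprite;

    // Sprite wrapper around a Bitmap child, since Sprite itself has no bitmapData property
    public class AnimSprite extends Sprite {
        private var _bitmap:Bitmap;
        private var _frames:Vector.<BitmapData>;
        private var _currentFrame:int = 0;

        public function AnimSprite(frames:Vector.<BitmapData>) {
            _frames = frames;                 // keep a reference to the shared Vector, not a copy
            _bitmap = new Bitmap(_frames[0]); // the Bitmap child that actually displays a frame
            addChild(_bitmap);
        }
    }
}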
Now the way I handled the BitmapData was to keep one Vector holding the stored frames. Then as I created new sprites, I passed each sprite a reference to that Vector. Inside each sprite there’s a timer that updates the displayed frame using:
bitmap.bitmapData = data_vector[frame_number]
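Inside that same class, the timer part looks more or less like this (with flash.utils.Timer and flash.events.TimerEvent imported; the 24 fps is just an example rate):

private var _timer:Timer;

private function startAnimation(fps:Number = 24):void {
    _timer = new Timer(1000 / fps);                   // repeatCount of 0 means it keeps firing
    _timer.addEventListener(TimerEvent.TIMER, onTick);
    _timer.start();
}

private function onTick(e:TimerEvent):void {
    _currentFrame = (_currentFrame + 1) % _frames.length;
    _bitmap.bitmapData = _frames[_currentFrame];      // swap in the next frame from the shared Vector
}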
The problem is that when I get to about 20 instances on my iPhone 3G, the new ones stop animating. Now apparently we use BitmapData because it gets uploaded to the iOS GPU automatically? But it looks like I’m running out of memory, because instances beyond the first 20 or so don’t animate. So it seems like every single instance is chewing up a nice chunk of available memory for its bitmap data, even though they are all referencing the same Vector of BitmapData.
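To double-check the memory theory I figure I can trace flash.system.System.totalMemory as I add instances, with a little helper like this (instanceCount is just whatever counter I keep; needs import flash.system.System):

private function logMemory(instanceCount:int):void {
    // System.totalMemory reports the runtime's memory use in bytes
    trace("instances: " + instanceCount + ", memory: " + (System.totalMemory / 1024) + " KB");
}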
Anybody have any ideas what I’m doing wrong? How do I set up one Vector of BitmapData so it only gets uploaded to the GPU once?
Thanks!