Does anyone happen to know the best way to store 10,000 rows of data?
Let’s say that I have 10,000 rows, each with 10 fields (random numbers), and each field averages about 3 bytes of data…
totaling 300,000 bytes, or 300 KB (MINIMUM).
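Just to double-check that arithmetic (a throwaway Python snippet, purely for illustration):

```python
rows = 10_000
fields_per_row = 10
avg_bytes_per_field = 3

total_bytes = rows * fields_per_row * avg_bytes_per_field
print(total_bytes)         # 300000
print(total_bytes / 1000)  # 300.0 (KB)
```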
But normally most systems use more than that, such as MySQL, which would probably inflate it to a nice 1 MB.
Here is my problem: I need to hit that “table” A LOT… sift through the rows, spit out certain ones, etc… and worse, write back to the file.
MySQL is definitely not an option, as something like this seems to take a long time…
I will probably use fopen and fclose. However, parsing the data string into an array is very laggy… 10,000 pieces of data take .05 seconds, so the entire file (100,000 fields total) would theoretically take 0.5 seconds (faster than MySQL, but still slow).
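For context, the parse-into-an-array step I’m timing looks roughly like this (a Python sketch, assuming comma-separated fields; my actual code uses fopen-style calls, and the sample data here is made up):

```python
def parse_rows(text, sep=","):
    # Split the raw file contents into rows, then each row into fields.
    # This per-field splitting is the part that gets laggy at scale.
    return [line.split(sep) for line in text.splitlines() if line]

sample = "1,2,3\n4,5,6\n"
print(parse_rows(sample))  # [['1', '2', '3'], ['4', '5', '6']]
```

Timing that over 100,000 fields is where the ~0.5 seconds comes from.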
Does anyone know of any alternatives?
Something faster than fopen? I’m thinking of going with C#… or something later, but sticking with the laggy 0.5-second approach for now.