Topic by Adam
Sooo... I've been refactoring my code for days now, and I finally have to admit that I'm licked. I'm attempting to iterate through my Organizations table and "clean up" the SLAs. My logic is sound, and it works fine in a small environment. The issue only occurs on large queries: the dreaded Out of memory.
It starts out fine, but as you can tell from the logging -- the little piggy gobbles up all the dessert:
(19:21:46) Begin processing
(19:21:46) Processing rows 1 to 10000
(19:21:50) ** COMMIT ** :: Changes: 0, Rows: 500, Memory: 68943872
(19:21:53) ** COMMIT ** :: Changes: 0, Rows: 1000, Memory: 120061952
(19:21:55) ** COMMIT ** :: Changes: 0, Rows: 1500, Memory: 144703488
(19:21:58) ** COMMIT ** :: Changes: 0, Rows: 2000, Memory: 176422912
(19:22:00) ** COMMIT ** :: Changes: 0, Rows: 2500, Memory: 219938816
(19:22:03) ** COMMIT ** :: Changes: 0, Rows: 3000, Memory: 248774656
(19:22:05) ** COMMIT ** :: Changes: 0, Rows: 3500, Memory: 278921216

Fatal error: Out of memory (allocated 285736960) (tried to allocate 54 bytes) in /cgi-bin/site.cfg/scripts/custom/sla_fix.php on line 46
I've lost count of the number of different (sometimes completely ridiculous) ways I've attacked this. My fall-back is to accept an offset argument at runtime and process the table in blocks, rather than letting the script loop through everything in one run (this, however, is admitting defeat -- which leaves a bad taste in my mouth).
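For the curious, the fall-back would look something like this. This is only a sketch: the table and column names, and the fixSla() helper, are placeholders, not my real code, and I'm assuming MySQL-style LIMIT/OFFSET and a PDO handle.

```php
<?php
// Sketch of the block-processing fall-back (placeholder names throughout).
// Instead of pulling the whole Organizations table into RAM, fetch one
// block at a time and release it before fetching the next.
function processInBlocks(PDO $db, int $blockSize = 500): void
{
    $offset = 0;
    while (true) {
        // Fetch one block; integers are cast via sprintf, so no binding needed.
        $stmt = $db->query(sprintf(
            'SELECT id, sla FROM Organizations ORDER BY id LIMIT %d OFFSET %d',
            $blockSize, $offset
        ));
        $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
        if (!$rows) {
            break; // past the last row
        }
        foreach ($rows as $row) {
            fixSla($row); // hypothetical per-row clean-up
        }
        printf("** COMMIT ** :: Rows: %d, Memory: %d\n",
            $offset + count($rows), memory_get_usage());
        $offset += $blockSize;
        // Drop references so the engine can reuse the memory;
        // gc_collect_cycles() only matters if circular references
        // are keeping the rows alive.
        unset($rows, $stmt);
        gc_collect_cycles();
    }
}
```

My understanding is that the unset() plus gc_collect_cycles() at the end of each pass is what should keep the curve flat; if memory still climbs with something like this, the references must be held somewhere I can't reach (e.g. inside a vendor class).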
So, I submit to the wiser amongst us -- what am I doing wrong? Is my memory (mis)management really that bad? Is this simply to be chalked up to "PHP SUX"? Is this an internal class memory leak, completely out of my control? ... Solar flares? :(