I am using PHP to read a large table (800,000 rows, 130 columns) from Hive, and I am getting an "allowed memory size exhausted" error. I have seen the same question answered on Stack Overflow by setting memory_limit = 512M (say) in the php.ini file.
I did the same and restarted the service, but the same error still pops up. Any suggestion would be of great help.
There are several problems here:
- Your MySQL server uses memory and resources to produce such a large result set.
- Your result list in PHP is huge. For example, if you fetch a collection of large objects, the result list can hold 800,000 objects in memory at once.
It's better to paginate the data. Below I'll explain how.
Pagination
Pagination means limiting the query results with `LIMIT offset, size`. The query looks like this:
`SELECT foo, bar FROM table LIMIT 0, 100`
With small query results, you use less memory and fewer resources.
Read the full story here: http://mysql.rjweb.org/doc.php/pagination
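A minimal sketch of the loop in PHP, assuming a mysqli connection and placeholder table/column names (`my_table`, `foo`, `bar` are illustrative, not from your schema):

```php
<?php
// Build the LIMIT clause for a given zero-based page.
function pageLimit(int $page, int $pageSize): string {
    $offset = $page * $pageSize;
    return "LIMIT $offset, $pageSize";
}

// Fetch all rows page by page so only one page is in memory at a time.
// Connection details and table/column names are assumptions -- adapt
// them to your own setup.
function fetchAllPaged(mysqli $db, int $pageSize = 100): void {
    for ($page = 0; ; $page++) {
        $sql = "SELECT foo, bar FROM my_table " . pageLimit($page, $pageSize);
        $result = $db->query($sql);
        if ($result->num_rows === 0) {
            break; // no more rows
        }
        while ($row = $result->fetch_assoc()) {
            // process one row, then let it go out of scope
        }
        $result->free(); // release this page's memory before the next one
    }
}
```

Note that with very large offsets, `LIMIT offset, size` gets slower, because MySQL still has to scan and discard the skipped rows; the article linked above describes keyset pagination (`WHERE id > last_seen_id LIMIT size`), which avoids that.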
Performance
To avoid performance issues, consider archiving old data. Let me give an example:
- Use a small working table that holds only recent data, for example the last month.
- All data older than one month is moved to an 'archive' table.
- Reads go to the small working table, so they are fast.
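The archiving step above can be sketched in SQL. This assumes an `events` table with a `created_at` column and a matching `events_archive` table; all names are illustrative:

```sql
-- Move rows older than one month into the archive table...
INSERT INTO events_archive
SELECT * FROM events
WHERE created_at < NOW() - INTERVAL 1 MONTH;

-- ...then remove them from the hot working table.
DELETE FROM events
WHERE created_at < NOW() - INTERVAL 1 MONTH;
```

Run this as a scheduled job (e.g. a nightly cron task), ideally inside a transaction so a failure between the two statements can't lose rows.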
Future
You can probably expect more and more data, so it's a good idea to think further ahead. Using Elasticsearch
or Solr
, you can query, filter, analyse and aggregate millions of documents in milliseconds.
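For a feel of what that looks like, here is a minimal Elasticsearch search request (the index name `documents` and field `title` are placeholders, not from your data):

```json
POST /documents/_search
{
  "query": { "match": { "title": "foo" } },
  "size": 100
}
```

The same request also accepts an `aggs` section, which is what lets you aggregate over millions of documents without pulling them into PHP at all.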