bigdata - Python - big list issue
I need to create and search a big list, for example a list of 3.9594086612e+65 arrays of 10x10 (or a bigger list of bigger arrays).
I need to create a list of combinations, then filter out combinations according to a rule. The process should be repeated until 1 combination is left.
If I try to create the list, it crashes because of memory after a few minutes.
I imagine the solution should store the data in a different way than as a list in memory.
What is a possible, correct, and easy way? An SQL database? A NoSQL database? Or multiple text files opened and closed one after another? I need to run through the list multiple times.
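Since the question shows no code, here is a minimal sketch, assuming the combinations can be produced with `itertools.combinations` and that the filtering rule can be expressed as a predicate (both `rule` and `filter_combinations` are hypothetical names for illustration). The point is to iterate lazily instead of materializing the whole list:

```python
from itertools import combinations

def filter_combinations(items, r, rule):
    """Yield only the combinations of length r that satisfy `rule`.

    `rule` is a placeholder predicate standing in for the filtering
    rule described in the question.
    """
    # combinations() is a generator: it produces one tuple at a time,
    # so memory use stays constant regardless of how many exist.
    for combo in combinations(items, r):
        if rule(combo):
            yield combo

# Toy usage: keep only the 3-element combinations of 0..5 whose sum is even.
survivors = list(filter_combinations(range(6), 3, lambda c: sum(c) % 2 == 0))
```

Note that a generator is single-pass; since the question needs multiple passes, one option under this approach is to simply regenerate the stream each pass, which costs CPU time but no memory.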
Hi:
3.9594086612e+65
That is more than all the computer memory in the world.
(And that is before even multiplying by the size of each 10x10 array!)
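To put the number in perspective, here is a back-of-the-envelope estimate, assuming each 10x10 array holds 8-byte floats (the element type is an assumption, since the question does not say):

```python
# Rough memory estimate for the list described in the question.
n_arrays = 3.9594086612e65        # number of arrays, from the question
bytes_per_array = 10 * 10 * 8     # a 10x10 array of 8-byte floats (assumption)
total_bytes = n_arrays * bytes_per_array
print(f"{total_bytes:.2e} bytes")  # on the order of 3e+68 bytes
```

For comparison, total worldwide data storage is estimated in zettabytes (1e+21 bytes), dozens of orders of magnitude short of 3e+68, so no storage scheme can hold this list explicitly.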