bigdata - Python - big list issue


I need to create and search a big list, for example a list of 3.9594086612e+65 arrays of 10x10 (or an even bigger list of bigger arrays).

I need to create a list of combinations, then filter out combinations according to a rule. The process should be repeated until one combination is left.

If I try to create the list, it crashes because of memory after a few minutes.

I imagine the solution is to store the data in a different way than a list in memory.

What is a possible, correct, and easy way? An SQL database? A NoSQL database? Or multiple text files opened and closed one after another? I need to run through the list multiple times.
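Since a list this size can never be materialized, one standard approach in Python is to generate and filter the combinations lazily with generators, so only one candidate exists in memory at a time. A minimal sketch, where `candidates()` and `passes_rule()` are hypothetical stand-ins for the real combination source and filter rule:

```python
from itertools import islice

def candidates():
    # Hypothetical generator: yields candidate 10x10 arrays one at a time
    # instead of building them all into a list.
    for i in range(10**6):  # stand-in for the real combination source
        yield [[(i + r * c) % 10 for c in range(10)] for r in range(10)]

def passes_rule(arr):
    # Hypothetical filter rule: keep arrays whose top-left entry is 0.
    return arr[0][0] == 0

# Stream through the candidates, keeping only survivors; memory use is
# per-candidate, not proportional to the whole list.
survivors = (a for a in candidates() if passes_rule(a))
first_few = list(islice(survivors, 3))
print(len(first_few))  # → 3
```

Because generators are consumed once, running through the data multiple times means regenerating it (or writing survivors of each pass to disk); the key point is that the full set is never held in memory at once.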

Hi:

3.9594086612e+65

That's more than all the computer memory in the world.

(And that's not even multiplying by 10!)
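To put the answer's point in numbers, here is a rough back-of-the-envelope check, assuming 8-byte entries per array cell and an estimated total worldwide data storage on the order of 1e23 bytes (both figures are assumptions for illustration):

```python
n_arrays = 3.9594086612e+65      # number of 10x10 arrays from the question
bytes_per_array = 10 * 10 * 8    # 100 entries, assuming 8 bytes each
total_bytes = n_arrays * bytes_per_array

world_storage = 1e23             # rough estimate of all storage on Earth, in bytes
print(total_bytes > world_storage)  # → True
print(total_bytes / world_storage)  # dozens of orders of magnitude too big
```

Even under far more generous assumptions (1 bit per array, say), the conclusion is the same: the list cannot be stored, so any workable approach must avoid ever enumerating it in full.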

