javascript - How to load "raw" row data into ag-grid


I'm dealing with a high-throughput problem. The goal is to display, at least on the Chrome browser, a grid composed of 1M rows.

These rows are dynamically fetched from a Python server running on the same machine. The server has the whole dataset loaded in memory. Communication between the client (the browser) and the server (Python) takes place through a websocket. The grid has the option virtualPaging: true.
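As a sketch of that setup (the `ws` variable and the `paginationPageSize` option are illustrative assumptions; only `virtualPaging` is named in the question), the grid could be wired to the websocket like this:

```javascript
// Illustrative sketch; `ws` stands in for the real websocket connection
// and is stubbed here so the snippet is self-contained.
var ws = { send: function (msg) { /* real code would transmit over the socket */ } };

var gridOptions = {
    virtualPaging: true,          // option named in the question
    paginationPageSize: 100,      // assumed page-size option
    datasource: {
        getRows: function (params) {
            // Ask the server for one page of raw rows over the websocket;
            // the websocket "message" handler later calls params.successCallback.
            ws.send(JSON.stringify({
                command: "getRows",
                start: params.startRow,
                end: params.endRow
            }));
        }
    }
};
```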

So far I can reach these performances loading pages of 100 rows each. Despite that, loading the whole 1M dataset at the beginning (therefore without any remote fetching of rows) shows a significant improvement in scrolling (no "white rows" effect).

I want to achieve the same performance without storing the whole dataset in browser memory.

The first step is to try to avoid conversion steps. The client receives from the server an array of arrays, which means the row model on the server is "positional" (given a generic row r, r[0] is the element related to the first column, r[1] to the second, and so on). The successCallback function of ag-grid, however, requires an array of objects, which means each row takes keys related to the column names (given a generic row r, r["firstColumn"] is the element related to the first column, r["secondColumn"] to the second, and so on).

The second approach is totally infeasible from the server perspective, given the huge waste of memory caused by the key-value mechanism. This leads to the need for a local conversion of each page received by the client:

client.appendCallback("message", function (message) {
    message = JSON.parse(message.data);
    switch (message.command) {
        case "getRows":
            if (message.res.code == 0) {
                var bulk = [];
                var arr = message.res.data;
                for (var i = 0, len = arr.length; i < len; i++) {
                    bulk[i] = {
                        "par1": arr[i][0], "par2": arr[i][1], "par3": arr[i][2],
                        "par4": arr[i][3], "par5": arr[i][4], "par6": arr[i][5]
                    };
                }
                _data.successCallback(bulk);
            }
            break;
        default:
            break;
    }
}, "ws");

What I need is a way to pass successCallback rows as arrays and not as objects, avoiding the conversion part, like this:

client.appendCallback("message", function (message) {
    message = JSON.parse(message.data);
    switch (message.command) {
        case "getRows":
            if (message.res.code == 0) {
                _data.successCallback(message.res.data);
            }
            break;
        default:
            break;
    }
}, "ws");
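One way to get exactly that shape is to keep the rows as raw arrays and let each column definition read its cell by position through a valueGetter (a sketch, not verified against the ag-grid version in use; the column names par1..par6 are taken from the conversion loop above):

```javascript
// Columns read raw array rows positionally, so no per-page conversion
// is needed and successCallback can receive message.res.data as-is.
var columnNames = ["par1", "par2", "par3", "par4", "par5", "par6"];

var columnDefs = columnNames.map(function (name, index) {
    return {
        headerName: name,
        // params.data is one raw row (an array), indexed positionally
        valueGetter: function (params) { return params.data[index]; }
    };
});
```

With column definitions like these, the websocket handler can stay in the minimal `_data.successCallback(message.res.data)` form.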

Any help is appreciated.

What about this:

Fix a pageSize of 100.

Since you use server-side paging and have implemented your own datasource: when you're asked to load data, load [and convert] 10,000 rows and store them in memory.

Then use your own intermediary paging: each time the grid asks for the next rows, either take them from memory, or fetch the next 10k rows and [convert and] return the first hundred.

The [convert] part is your choice: place the conversion operation either when loading from the server or when asked for the next 100 rows.

If the amount of data is huge and you consider deploying somewhere other than a local computer, Angular (or the browser, I don't know which it is) supports gzip-compressed data transparently.

