java - How to convert Spark dataframe output to JSON?


I am reading a CSV file using the Spark SQL context.

Code:

m.put("path", csv_directory+file.getoriginalfilename()); m.put("inferschema", "true"); // automatically infer data types else string default m.put("header", "true");      // use first line of files header          m.put("delimiter", ";");  dataframe df = sqlcontext.load("com.databricks.spark.csv",m);               df.printschema(); 

I am fetching the column names and data types with df.printSchema().

Output:

|-- id: integer (nullable = true)
|-- applicationno: string (nullable = true)
|-- applidate: timestamp (nullable = true)

What is the return type of printSchema()? How can I convert this output to JSON format, i.e. how do I convert the data frame's schema to JSON?

Desired output:

{"column":"id","datatype":"integer"} 

DataType has a json() method and a fromJson() method that you can use to serialize and deserialize schemas.

import org.apache.spark.sql.types.{DataType, StructType}

val df = sqlContext.read.....load()
val jsonString: String = df.schema.json
val schema: StructType = DataType.fromJson(jsonString).asInstanceOf[StructType]
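Note that printSchema() only prints the schema tree to the console (its return type is void/Unit); the schema object itself is obtained from df.schema(). If you want exactly the {"column":"id","datatype":"integer"} shape from the question rather than the full nested JSON that df.schema.json produces, you can build it from the individual StructFields. Below is a minimal Java sketch, assuming df is the DataFrame loaded in the question; the manual string building is only illustrative (a JSON library would do the same job):

import org.apache.spark.sql.types.StructField;
import org.apache.spark.sql.types.StructType;

StructType schema = df.schema();   // printSchema() prints this tree; schema() returns it
StructField[] fields = schema.fields();
StringBuilder json = new StringBuilder("[");
for (int i = 0; i < fields.length; i++) {
    if (i > 0) json.append(",");
    json.append("{\"column\":\"").append(fields[i].name())
        .append("\",\"datatype\":\"").append(fields[i].dataType().typeName())
        .append("\"}");
}
json.append("]");
System.out.println(json.toString());
// e.g. [{"column":"id","datatype":"integer"},{"column":"applicationno","datatype":"string"},...]

The json()/fromJson() approach from the answer above remains the right tool when you need to round-trip the full schema (including nullability and nested types); the loop here only produces the flat column/datatype pairs the question asks for.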
