Converting a Spark DataFrame to JSON

成义随笔 · 2022-08-10 · 81 reads

First, create a DataFrame:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import scala.util.parsing.json.{JSON, JSONArray, JSONObject}

val conf = new SparkConf().setAppName("TTyb").setMaster("local")
val sc = new SparkContext(conf)
val spark = new SQLContext(sc)
val testDataFrame = spark.createDataFrame(Seq(
  ("1", "asf"),
  ("2", "2143"),
  ("3", "rfds")
)).toDF("label", "col")
testDataFrame.show()

The printed output is:

+-----+----+
|label| col|
+-----+----+
| 1| asf|
| 2|2143|
| 3|rfds|
+-----+----+

Spark's built-in function

val sparkFunction = testDataFrame.toJSON.collectAsList.toString
println(sparkFunction)
// Result:
// [{"label":"1","col":"asf"}, {"label":"2","col":"2143"}, {"label":"3","col":"rfds"}]
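Note that `collectAsList.toString` relies on `java.util.List`'s `toString` formatting to produce the brackets and commas. Joining the per-row JSON strings explicitly is more deliberate; a minimal sketch, using literal stand-in data for the result of `toJSON.collect()` so it runs without a Spark session:

```scala
// Stand-in for testDataFrame.toJSON.collect(): one JSON string per row
val rowJson = Array(
  """{"label":"1","col":"asf"}""",
  """{"label":"2","col":"2143"}""",
  """{"label":"3","col":"rfds"}"""
)
// Join the rows into a well-formed JSON array string
val jsonArrayString = rowJson.mkString("[", ",", "]")
println(jsonArrayString)
// [{"label":"1","col":"asf"},{"label":"2","col":"2143"},{"label":"3","col":"rfds"}]
```

With a live DataFrame, `testDataFrame.toJSON.collect()` would take the place of `rowJson`.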

List-style JSON

But if you want the first column as the `key` and the second column as the `value`, write it like this:

val df2Array: Array[(String, String)] = testDataFrame.collect().map { row =>
  (row(0).toString, row(1).toString)
}
val jsonData: Array[JSONObject] = df2Array.map { i =>
  new JSONObject(Map(i._1 -> i._2))
}
val jsonArray: JSONArray = new JSONArray(jsonData.toList)
println(jsonArray)
// [{"1" : "asf"}, {"2" : "2143"}, {"3" : "rfds"}]

Merging the JSONArray key:value pairs

In the output above, each `key:value` pair sits inside its own braces. How do we merge them into a single object? A bit of string processing does it:

val df2Array: Array[(String, String)] = testDataFrame.collect().map { row =>
  (row(0).toString, row(1).toString)
}
val jsonData: Array[JSONObject] = df2Array.map { i =>
  new JSONObject(Map(i._1 -> i._2))
}
val jsTest = jsonData.mkString(",").replace("},{", ",")
println(jsTest)
// {"1" : "asf","2" : "2143","3" : "rfds"}
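The `replace("},{", ",")` trick is fragile: it would corrupt the output if any value itself contained the substring `},{`. A safer sketch builds the merged object in a single pass from the collected pairs; literal sample data stands in for `df2Array` so the snippet runs on its own:

```scala
// Stand-in for df2Array, the collected (key, value) pairs
val pairs = Array(("1", "asf"), ("2", "2143"), ("3", "rfds"))
// Format each pair and join once, instead of post-processing JSON strings
val mergedJson = pairs
  .map { case (k, v) => "\"" + k + "\" : \"" + v + "\"" }
  .mkString("{", ",", "}")
println(mergedJson)
// {"1" : "asf","2" : "2143","3" : "rfds"}
```

Note this simple formatter assumes keys and values need no JSON escaping; for arbitrary strings, a JSON library should do the quoting.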

How do we turn this string into a `map` so that a `value` can be looked up by its `key`? Define a small function:

def regJson(json: Option[Any]): Map[String, Any] = json match {
  // @unchecked silences the unavoidable type-erasure warning on the Map pattern
  case Some(map: Map[String, Any] @unchecked) => map
  case _ => Map.empty // parse failure or non-object JSON
}
println(regJson(JSON.parseFull(jsTest)))
// Map(1 -> asf, 2 -> 2143, 3 -> rfds)
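If the end goal is simply a Scala `Map`, the JSON round-trip can be skipped entirely by converting the collected pairs directly. A sketch, again with sample pairs in place of `df2Array`:

```scala
// Stand-in for df2Array, the collected (key, value) pairs
val pairs = Array(("1", "asf"), ("2", "2143"), ("3", "rfds"))
// toMap turns a collection of tuples straight into a Map
val merged: Map[String, String] = pairs.toMap
println(merged("2"))
// 2143
```

This avoids both the string surgery and the unchecked cast in `regJson`, at the cost of not producing an intermediate JSON string.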
